Feb 17 09:05:21 crc systemd[1]: Starting Kubernetes Kubelet...
Feb 17 09:05:21 crc restorecon[4713]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Feb 17 09:05:21 crc restorecon[4713]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc 
restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc 
restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 
09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc 
restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc 
restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 
09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 09:05:21 crc 
restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc 
restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Feb 17 09:05:21 crc restorecon[4713]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc 
restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:21 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 
17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 
crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc 
restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc 
restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc 
restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc 
restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 09:05:22 crc restorecon[4713]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Feb 17 09:05:22 crc restorecon[4713]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Feb 17 09:05:22 crc kubenswrapper[4848]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 09:05:22 crc kubenswrapper[4848]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Feb 17 09:05:22 crc kubenswrapper[4848]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 09:05:22 crc kubenswrapper[4848]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Feb 17 09:05:22 crc kubenswrapper[4848]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Feb 17 09:05:22 crc kubenswrapper[4848]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.986681 4848 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992530 4848 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992570 4848 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992583 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992592 4848 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992603 4848 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992614 4848 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992623 4848 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992632 4848 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992640 4848 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992649 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992657 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992665 4848 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992673 4848 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992681 4848 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992689 4848 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992696 4848 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992704 4848 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992712 4848 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992727 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 
17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992735 4848 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992742 4848 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992750 4848 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992786 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992794 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992802 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992809 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992818 4848 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992826 4848 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992833 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992841 4848 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992849 4848 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992857 4848 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992865 4848 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992872 4848 feature_gate.go:330] unrecognized feature gate: 
BootcNodeManagement Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992880 4848 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992890 4848 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992897 4848 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992906 4848 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992916 4848 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992924 4848 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992932 4848 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992939 4848 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992947 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992955 4848 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992963 4848 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992971 4848 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992978 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.992986 4848 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 
09:05:22.992994 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993002 4848 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993009 4848 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993020 4848 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993030 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993038 4848 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993049 4848 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993057 4848 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993066 4848 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993075 4848 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993084 4848 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993092 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993100 4848 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993110 4848 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993120 4848 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993128 4848 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993136 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993146 4848 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993153 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993161 4848 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993170 4848 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993178 4848 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 09:05:22 crc kubenswrapper[4848]: W0217 09:05:22.993185 4848 feature_gate.go:330] unrecognized feature gate: Example Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994101 4848 flags.go:64] FLAG: --address="0.0.0.0" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994125 4848 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994140 4848 flags.go:64] FLAG: --anonymous-auth="true" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994152 4848 flags.go:64] FLAG: --application-metrics-count-limit="100" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994163 4848 flags.go:64] FLAG: --authentication-token-webhook="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994172 4848 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 
09:05:22.994184 4848 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994195 4848 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994205 4848 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994214 4848 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994224 4848 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994234 4848 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994243 4848 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994253 4848 flags.go:64] FLAG: --cgroup-root="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994261 4848 flags.go:64] FLAG: --cgroups-per-qos="true" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994270 4848 flags.go:64] FLAG: --client-ca-file="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994279 4848 flags.go:64] FLAG: --cloud-config="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994288 4848 flags.go:64] FLAG: --cloud-provider="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994297 4848 flags.go:64] FLAG: --cluster-dns="[]" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994330 4848 flags.go:64] FLAG: --cluster-domain="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994340 4848 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994349 4848 flags.go:64] FLAG: --config-dir="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994358 4848 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Feb 17 09:05:22 
crc kubenswrapper[4848]: I0217 09:05:22.994369 4848 flags.go:64] FLAG: --container-log-max-files="5" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994380 4848 flags.go:64] FLAG: --container-log-max-size="10Mi" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994389 4848 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994399 4848 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994409 4848 flags.go:64] FLAG: --containerd-namespace="k8s.io" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994418 4848 flags.go:64] FLAG: --contention-profiling="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994428 4848 flags.go:64] FLAG: --cpu-cfs-quota="true" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994437 4848 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994447 4848 flags.go:64] FLAG: --cpu-manager-policy="none" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994455 4848 flags.go:64] FLAG: --cpu-manager-policy-options="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994466 4848 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994476 4848 flags.go:64] FLAG: --enable-controller-attach-detach="true" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994485 4848 flags.go:64] FLAG: --enable-debugging-handlers="true" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994494 4848 flags.go:64] FLAG: --enable-load-reader="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994503 4848 flags.go:64] FLAG: --enable-server="true" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994512 4848 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994530 4848 flags.go:64] 
FLAG: --event-burst="100" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994539 4848 flags.go:64] FLAG: --event-qps="50" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994549 4848 flags.go:64] FLAG: --event-storage-age-limit="default=0" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994558 4848 flags.go:64] FLAG: --event-storage-event-limit="default=0" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994567 4848 flags.go:64] FLAG: --eviction-hard="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994577 4848 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994586 4848 flags.go:64] FLAG: --eviction-minimum-reclaim="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994595 4848 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994605 4848 flags.go:64] FLAG: --eviction-soft="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994613 4848 flags.go:64] FLAG: --eviction-soft-grace-period="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994622 4848 flags.go:64] FLAG: --exit-on-lock-contention="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994631 4848 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994641 4848 flags.go:64] FLAG: --experimental-mounter-path="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994650 4848 flags.go:64] FLAG: --fail-cgroupv1="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994659 4848 flags.go:64] FLAG: --fail-swap-on="true" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994667 4848 flags.go:64] FLAG: --feature-gates="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994690 4848 flags.go:64] FLAG: --file-check-frequency="20s" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994700 4848 flags.go:64] FLAG: 
--global-housekeeping-interval="1m0s" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994711 4848 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994721 4848 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994730 4848 flags.go:64] FLAG: --healthz-port="10248" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994740 4848 flags.go:64] FLAG: --help="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994749 4848 flags.go:64] FLAG: --hostname-override="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994782 4848 flags.go:64] FLAG: --housekeeping-interval="10s" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994792 4848 flags.go:64] FLAG: --http-check-frequency="20s" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994801 4848 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994810 4848 flags.go:64] FLAG: --image-credential-provider-config="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994819 4848 flags.go:64] FLAG: --image-gc-high-threshold="85" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994828 4848 flags.go:64] FLAG: --image-gc-low-threshold="80" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994837 4848 flags.go:64] FLAG: --image-service-endpoint="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994845 4848 flags.go:64] FLAG: --kernel-memcg-notification="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994855 4848 flags.go:64] FLAG: --kube-api-burst="100" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994864 4848 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994873 4848 flags.go:64] FLAG: --kube-api-qps="50" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994882 4848 
flags.go:64] FLAG: --kube-reserved="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994892 4848 flags.go:64] FLAG: --kube-reserved-cgroup="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994901 4848 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994910 4848 flags.go:64] FLAG: --kubelet-cgroups="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994919 4848 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994927 4848 flags.go:64] FLAG: --lock-file="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994936 4848 flags.go:64] FLAG: --log-cadvisor-usage="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994945 4848 flags.go:64] FLAG: --log-flush-frequency="5s" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994955 4848 flags.go:64] FLAG: --log-json-info-buffer-size="0" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994968 4848 flags.go:64] FLAG: --log-json-split-stream="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994977 4848 flags.go:64] FLAG: --log-text-info-buffer-size="0" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994985 4848 flags.go:64] FLAG: --log-text-split-stream="false" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.994995 4848 flags.go:64] FLAG: --logging-format="text" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.995004 4848 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.995014 4848 flags.go:64] FLAG: --make-iptables-util-chains="true" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.995023 4848 flags.go:64] FLAG: --manifest-url="" Feb 17 09:05:22 crc kubenswrapper[4848]: I0217 09:05:22.995040 4848 flags.go:64] FLAG: --manifest-url-header="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995057 4848 
flags.go:64] FLAG: --max-housekeeping-interval="15s" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995078 4848 flags.go:64] FLAG: --max-open-files="1000000" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995089 4848 flags.go:64] FLAG: --max-pods="110" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995098 4848 flags.go:64] FLAG: --maximum-dead-containers="-1" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995108 4848 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995117 4848 flags.go:64] FLAG: --memory-manager-policy="None" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995126 4848 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995136 4848 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995145 4848 flags.go:64] FLAG: --node-ip="192.168.126.11" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995154 4848 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995173 4848 flags.go:64] FLAG: --node-status-max-images="50" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995182 4848 flags.go:64] FLAG: --node-status-update-frequency="10s" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995192 4848 flags.go:64] FLAG: --oom-score-adj="-999" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995201 4848 flags.go:64] FLAG: --pod-cidr="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995209 4848 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995222 4848 flags.go:64] FLAG: 
--pod-manifest-path="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995231 4848 flags.go:64] FLAG: --pod-max-pids="-1" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995240 4848 flags.go:64] FLAG: --pods-per-core="0" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995249 4848 flags.go:64] FLAG: --port="10250" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995258 4848 flags.go:64] FLAG: --protect-kernel-defaults="false" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995267 4848 flags.go:64] FLAG: --provider-id="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995276 4848 flags.go:64] FLAG: --qos-reserved="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995285 4848 flags.go:64] FLAG: --read-only-port="10255" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995294 4848 flags.go:64] FLAG: --register-node="true" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995303 4848 flags.go:64] FLAG: --register-schedulable="true" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995312 4848 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995331 4848 flags.go:64] FLAG: --registry-burst="10" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995341 4848 flags.go:64] FLAG: --registry-qps="5" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995350 4848 flags.go:64] FLAG: --reserved-cpus="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995358 4848 flags.go:64] FLAG: --reserved-memory="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995369 4848 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995378 4848 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995387 4848 flags.go:64] FLAG: --rotate-certificates="false" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 
09:05:22.995399 4848 flags.go:64] FLAG: --rotate-server-certificates="false" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995408 4848 flags.go:64] FLAG: --runonce="false" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995417 4848 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995426 4848 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995448 4848 flags.go:64] FLAG: --seccomp-default="false" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995457 4848 flags.go:64] FLAG: --serialize-image-pulls="true" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995466 4848 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995475 4848 flags.go:64] FLAG: --storage-driver-db="cadvisor" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995484 4848 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995493 4848 flags.go:64] FLAG: --storage-driver-password="root" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995502 4848 flags.go:64] FLAG: --storage-driver-secure="false" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995511 4848 flags.go:64] FLAG: --storage-driver-table="stats" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995519 4848 flags.go:64] FLAG: --storage-driver-user="root" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995528 4848 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995537 4848 flags.go:64] FLAG: --sync-frequency="1m0s" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995547 4848 flags.go:64] FLAG: --system-cgroups="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995556 4848 flags.go:64] FLAG: 
--system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995569 4848 flags.go:64] FLAG: --system-reserved-cgroup="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995578 4848 flags.go:64] FLAG: --tls-cert-file="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995587 4848 flags.go:64] FLAG: --tls-cipher-suites="[]" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995605 4848 flags.go:64] FLAG: --tls-min-version="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995614 4848 flags.go:64] FLAG: --tls-private-key-file="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995623 4848 flags.go:64] FLAG: --topology-manager-policy="none" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995631 4848 flags.go:64] FLAG: --topology-manager-policy-options="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995640 4848 flags.go:64] FLAG: --topology-manager-scope="container" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995650 4848 flags.go:64] FLAG: --v="2" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995661 4848 flags.go:64] FLAG: --version="false" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995673 4848 flags.go:64] FLAG: --vmodule="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995683 4848 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.995692 4848 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996169 4848 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996183 4848 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996194 4848 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 09:05:23 crc 
kubenswrapper[4848]: W0217 09:05:22.996203 4848 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996212 4848 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996222 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996230 4848 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996237 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996245 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996253 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996273 4848 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996281 4848 feature_gate.go:330] unrecognized feature gate: Example Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996290 4848 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996297 4848 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996305 4848 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996313 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996320 4848 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996333 4848 feature_gate.go:353] Setting GA feature gate 
ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996343 4848 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996352 4848 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996361 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996371 4848 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996380 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996388 4848 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996396 4848 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996404 4848 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996414 4848 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996422 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996430 4848 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996438 4848 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996448 4848 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996456 4848 feature_gate.go:330] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996464 4848 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996474 4848 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996484 4848 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996492 4848 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996501 4848 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996510 4848 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996519 4848 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996527 4848 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996535 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996543 4848 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996551 4848 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996558 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996566 4848 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996574 4848 feature_gate.go:330] unrecognized 
feature gate: VSphereMultiNetworks Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996593 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996601 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996609 4848 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996618 4848 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996626 4848 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996634 4848 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996643 4848 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996650 4848 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996658 4848 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996666 4848 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996674 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996684 4848 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996694 4848 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996703 4848 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996712 4848 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996721 4848 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996730 4848 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996739 4848 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996747 4848 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996755 4848 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996786 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996806 4848 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996825 4848 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996833 4848 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:22.996841 4848 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:22.996864 4848 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false 
EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.021921 4848 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.021973 4848 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022138 4848 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022158 4848 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022168 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022177 4848 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022185 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022194 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022204 4848 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022212 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022220 4848 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022227 4848 
feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022236 4848 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022244 4848 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022252 4848 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022260 4848 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022268 4848 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022276 4848 feature_gate.go:330] unrecognized feature gate: Example Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022284 4848 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022291 4848 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022299 4848 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022306 4848 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022315 4848 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022322 4848 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022330 4848 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022340 4848 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022350 4848 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022359 4848 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022367 4848 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022375 4848 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022383 4848 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022390 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022398 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022406 4848 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022414 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022421 4848 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022429 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022437 4848 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022445 4848 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022453 4848 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 
09:05:23.022461 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022469 4848 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022476 4848 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022484 4848 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022493 4848 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022501 4848 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022509 4848 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022517 4848 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022526 4848 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022534 4848 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022542 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022551 4848 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022561 4848 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022572 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022581 4848 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022589 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022597 4848 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022604 4848 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022612 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022620 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022630 4848 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022640 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022649 4848 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022658 4848 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022666 4848 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022674 4848 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022683 4848 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022691 4848 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022699 4848 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022707 4848 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022715 4848 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022723 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.022732 4848 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.022746 4848 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false 
RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023002 4848 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023018 4848 feature_gate.go:330] unrecognized feature gate: NewOLM Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023031 4848 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023042 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023053 4848 feature_gate.go:330] unrecognized feature gate: PlatformOperators Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023062 4848 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023070 4848 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023078 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023087 4848 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023094 4848 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023103 4848 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023110 4848 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023118 4848 feature_gate.go:330] unrecognized feature 
gate: MachineAPIOperatorDisableMachineHealthCheckController Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023126 4848 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023134 4848 feature_gate.go:330] unrecognized feature gate: SignatureStores Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023142 4848 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023151 4848 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023158 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfig Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023166 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023174 4848 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023185 4848 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023196 4848 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023207 4848 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023217 4848 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023227 4848 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023236 4848 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023245 4848 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023253 4848 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023261 4848 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023271 4848 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023280 4848 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023289 4848 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023297 4848 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023305 4848 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023313 4848 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023320 4848 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023329 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023339 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023347 4848 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023355 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023362 4848 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023370 4848 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023378 4848 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023386 4848 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 
09:05:23.023393 4848 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023401 4848 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023409 4848 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023417 4848 feature_gate.go:330] unrecognized feature gate: Example Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023424 4848 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023432 4848 feature_gate.go:330] unrecognized feature gate: PinnedImages Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023440 4848 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023451 4848 feature_gate.go:330] unrecognized feature gate: GatewayAPI Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023459 4848 feature_gate.go:330] unrecognized feature gate: OVNObservability Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023467 4848 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023475 4848 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023483 4848 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023490 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023499 4848 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023506 4848 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Feb 17 09:05:23 crc 
kubenswrapper[4848]: W0217 09:05:23.023514 4848 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023522 4848 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023529 4848 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023539 4848 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023550 4848 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023560 4848 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023568 4848 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023576 4848 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023584 4848 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023593 4848 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023601 4848 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.023608 4848 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.023621 4848 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false 
RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.024933 4848 server.go:940] "Client rotation is on, will bootstrap in background" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.051369 4848 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.051463 4848 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.053668 4848 server.go:997] "Starting client certificate rotation" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.053694 4848 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.055464 4848 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-02 09:00:15.667155898 +0000 UTC Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.055510 4848 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.090888 4848 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.094219 4848 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.115459 4848 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed 
certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.137549 4848 log.go:25] "Validated CRI v1 runtime API" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.252034 4848 log.go:25] "Validated CRI v1 image API" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.254393 4848 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.260697 4848 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-02-17-09-00-31-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.260745 4848 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:41 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.284530 4848 manager.go:217] Machine: {Timestamp:2026-02-17 09:05:23.281262409 +0000 UTC m=+0.824518125 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] 
MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:8c2b9c04-4f3d-4d42-8125-29db61982ba4 BootID:1f8980ef-de02-4b2d-9798-0aff268a6b81 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:41 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d0:43:26 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d0:43:26 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:18:a6:48 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:a6:e6:fd Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:56:57:94 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1c:b9:05 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ee:1b:fe:b0:c1:4f Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:22:7b:81:6e:c6:41 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 
Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 
Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.284938 4848 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.285122 4848 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.287277 4848 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.287583 4848 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.287641 4848 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.287968 4848 topology_manager.go:138] "Creating topology manager with none policy" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.287985 4848 container_manager_linux.go:303] "Creating device plugin manager" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.288538 4848 manager.go:142] "Creating Device Plugin manager" 
path="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.288591 4848 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.288823 4848 state_mem.go:36] "Initialized new in-memory state store" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.288954 4848 server.go:1245] "Using root directory" path="/var/lib/kubelet" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.295028 4848 kubelet.go:418] "Attempting to sync node with API server" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.295066 4848 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.295137 4848 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.295157 4848 kubelet.go:324] "Adding apiserver pod source" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.295197 4848 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.301168 4848 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.302106 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.302187 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 
38.102.83.69:6443: connect: connection refused Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.302237 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.302262 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.302448 4848 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.307147 4848 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.308899 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.308946 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.308963 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.308977 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.308999 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.309012 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.309026 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.309048 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.309064 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.309078 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.309096 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.309110 4848 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.310359 4848 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.311035 4848 server.go:1280] "Started kubelet" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.311945 4848 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.312026 4848 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.312022 4848 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.312815 4848 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Feb 17 09:05:23 crc systemd[1]: Started Kubernetes Kubelet. Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.315337 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.315389 4848 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.315431 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 05:53:19.09458664 +0000 UTC Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.315594 4848 volume_manager.go:287] "The desired_state_of_world populator starts" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.315638 4848 volume_manager.go:289] "Starting Kubelet Volume Manager" Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.315688 4848 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 
09:05:23.315800 4848 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.316093 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.316210 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.316298 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="200ms" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.316507 4848 factory.go:55] Registering systemd factory Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.316526 4848 factory.go:221] Registration of the systemd container factory successfully Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.316840 4848 factory.go:153] Registering CRI-O factory Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.316865 4848 factory.go:221] Registration of the crio container factory successfully Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.316974 4848 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Feb 17 09:05:23 crc kubenswrapper[4848]: 
I0217 09:05:23.316999 4848 factory.go:103] Registering Raw factory Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.317020 4848 manager.go:1196] Started watching for new ooms in manager Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.317223 4848 server.go:460] "Adding debug handlers to kubelet server" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.317705 4848 manager.go:319] Starting recovery of all containers Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.320419 4848 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1894fd5b98cee2f7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 09:05:23.310994167 +0000 UTC m=+0.854249843,LastTimestamp:2026-02-17 09:05:23.310994167 +0000 UTC m=+0.854249843,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.330831 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331409 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 
09:05:23.331466 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331491 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331524 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331547 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331576 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331598 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331630 4848 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331653 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331674 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331702 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331736 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331810 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331835 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331864 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331887 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331907 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331932 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331953 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.331979 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332001 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332023 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332050 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332072 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332101 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332129 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332196 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332224 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332258 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332286 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332378 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332402 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332423 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332450 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332471 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332503 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332523 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332544 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332570 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332590 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332618 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332638 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332658 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332684 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332703 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332732 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332752 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332801 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332835 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332868 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" 
volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332888 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332921 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332966 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.332997 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333019 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333046 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333068 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333096 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333116 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333138 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333162 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333183 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" 
seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333209 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333230 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333251 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333277 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333298 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333324 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333343 4848 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333435 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333547 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333569 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333596 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333622 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333648 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333676 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333703 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333781 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333839 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333899 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333933 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.333991 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334018 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334051 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334162 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334189 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334272 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334395 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334428 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334447 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334478 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334498 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334516 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334578 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334606 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334732 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334788 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334808 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334840 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" 
seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334866 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334892 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334910 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.334930 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.335155 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.335187 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.335271 4848 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.335300 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.335329 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.335359 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336054 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336178 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336231 4848 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336290 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336329 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336375 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336417 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336464 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336690 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336727 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.336832 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.337124 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.337296 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.337433 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.337553 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.337696 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.337845 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.337946 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.337972 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.338003 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.338097 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.338122 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.338149 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.338434 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.338524 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.339690 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.339742 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.339787 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.345996 4848 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346074 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346110 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346141 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346173 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346198 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346222 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346249 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346275 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346301 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346331 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346368 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346398 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346425 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346476 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346508 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346534 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346558 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346598 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346630 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346659 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346685 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346709 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346736 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346817 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346858 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346884 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346914 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346944 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" 
seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.346969 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347015 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347085 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347115 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347141 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347166 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 
09:05:23.347191 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347217 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347246 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347282 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347309 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347347 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347381 4848 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347408 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347437 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347472 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347499 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347527 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347555 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347582 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347618 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347645 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347673 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347699 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347739 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" 
seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347798 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347827 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347855 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347880 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347910 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347937 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347971 4848 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.347999 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.348025 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.348053 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.348078 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.348103 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.348133 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.348197 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.348223 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.348251 4848 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.348277 4848 reconstruct.go:97] "Volume reconstruction finished" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.348293 4848 reconciler.go:26] "Reconciler: start to sync state" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.351954 4848 manager.go:324] Recovery completed Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.367180 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.368942 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.369070 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc 
kubenswrapper[4848]: I0217 09:05:23.369084 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.369927 4848 cpu_manager.go:225] "Starting CPU manager" policy="none" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.369943 4848 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.369962 4848 state_mem.go:36] "Initialized new in-memory state store" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.371686 4848 policy_none.go:49] "None policy: Start" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.372637 4848 memory_manager.go:170] "Starting memorymanager" policy="None" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.372665 4848 state_mem.go:35] "Initializing new in-memory state store" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.377068 4848 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.381684 4848 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.381807 4848 status_manager.go:217] "Starting to sync pod status with apiserver" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.382113 4848 kubelet.go:2335] "Starting kubelet main sync loop" Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.382206 4848 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.383550 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.383696 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.416456 4848 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.436075 4848 manager.go:334] "Starting Device Plugin manager" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.436170 4848 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.436191 4848 server.go:79] "Starting device plugin registration server" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.436897 4848 eviction_manager.go:189] "Eviction manager: starting control loop" Feb 17 09:05:23 crc 
kubenswrapper[4848]: I0217 09:05:23.436961 4848 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.437082 4848 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.437193 4848 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.437206 4848 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.445615 4848 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.482504 4848 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.482584 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.484103 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.484170 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.484198 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.484414 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" 
Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.484805 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.484888 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.485688 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.485743 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.485754 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.485930 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.486018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.486048 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.486065 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.486111 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.486162 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.486997 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.487034 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.487052 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.487353 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.487411 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.487462 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.487474 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.487495 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.487498 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.488627 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.488665 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.488686 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.488720 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.488784 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.488809 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.489137 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.489377 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.489459 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.490439 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.490489 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.490512 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.490969 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.491071 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.490987 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.491130 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.491182 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.494646 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.494703 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 
09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.494728 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.517305 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="400ms" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.537098 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.538425 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.538472 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.538482 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.538516 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.539033 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.550940 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.550996 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.551354 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.551445 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.551495 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.551637 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.551719 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.551786 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.551827 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.552196 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.552256 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.552302 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.552382 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.552462 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.552516 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.653741 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.653877 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.653928 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.653973 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654020 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654065 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654095 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 
09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654109 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654082 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654172 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654167 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654188 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654213 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod 
\"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654061 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654230 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654060 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654250 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654249 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654283 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654114 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654264 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654333 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654382 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654431 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654459 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654471 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654527 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654595 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.654266 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc 
kubenswrapper[4848]: I0217 09:05:23.654658 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.740095 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.742071 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.742129 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.742147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.742188 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.742945 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.835874 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.846867 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.876542 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.886938 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-c3799f1bfab1b87a75d72ed4bfd0fb31f26e8562bcdd1f1d2b5b5228d882551f WatchSource:0}: Error finding container c3799f1bfab1b87a75d72ed4bfd0fb31f26e8562bcdd1f1d2b5b5228d882551f: Status 404 returned error can't find the container with id c3799f1bfab1b87a75d72ed4bfd0fb31f26e8562bcdd1f1d2b5b5228d882551f Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.894493 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-3088e33b192cba6b8c9e46aca6502b62259bfa34a962539f8dceeb52a4e6d734 WatchSource:0}: Error finding container 3088e33b192cba6b8c9e46aca6502b62259bfa34a962539f8dceeb52a4e6d734: Status 404 returned error can't find the container with id 3088e33b192cba6b8c9e46aca6502b62259bfa34a962539f8dceeb52a4e6d734 Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.897138 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: I0217 09:05:23.907051 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.907899 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-067d8df3bbfb96c678e840ad616a3c47117e36dcc13dba8d6453375dfe466052 WatchSource:0}: Error finding container 067d8df3bbfb96c678e840ad616a3c47117e36dcc13dba8d6453375dfe466052: Status 404 returned error can't find the container with id 067d8df3bbfb96c678e840ad616a3c47117e36dcc13dba8d6453375dfe466052 Feb 17 09:05:23 crc kubenswrapper[4848]: E0217 09:05:23.919086 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="800ms" Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.921332 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-8df17ea00d9803c74bdc6ef7091844949c7800cbefba0e919b39e69427ac796c WatchSource:0}: Error finding container 8df17ea00d9803c74bdc6ef7091844949c7800cbefba0e919b39e69427ac796c: Status 404 returned error can't find the container with id 8df17ea00d9803c74bdc6ef7091844949c7800cbefba0e919b39e69427ac796c Feb 17 09:05:23 crc kubenswrapper[4848]: W0217 09:05:23.942632 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-8329ae00ff22bb638da8a265df2970334626d8a6d1281e7b11a5947dfdcba0fb WatchSource:0}: Error finding container 8329ae00ff22bb638da8a265df2970334626d8a6d1281e7b11a5947dfdcba0fb: Status 404 returned error can't find the container with id 
8329ae00ff22bb638da8a265df2970334626d8a6d1281e7b11a5947dfdcba0fb Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.143808 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.145161 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.145205 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.145221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.145255 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 09:05:24 crc kubenswrapper[4848]: E0217 09:05:24.145866 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Feb 17 09:05:24 crc kubenswrapper[4848]: W0217 09:05:24.165968 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:24 crc kubenswrapper[4848]: E0217 09:05:24.166047 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 17 09:05:24 crc kubenswrapper[4848]: W0217 09:05:24.268971 4848 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:24 crc kubenswrapper[4848]: E0217 09:05:24.269079 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.313609 4848 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.315614 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 05:04:57.55457279 +0000 UTC Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.387049 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"8329ae00ff22bb638da8a265df2970334626d8a6d1281e7b11a5947dfdcba0fb"} Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.388525 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8df17ea00d9803c74bdc6ef7091844949c7800cbefba0e919b39e69427ac796c"} Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.390421 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"067d8df3bbfb96c678e840ad616a3c47117e36dcc13dba8d6453375dfe466052"} Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.391398 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"c3799f1bfab1b87a75d72ed4bfd0fb31f26e8562bcdd1f1d2b5b5228d882551f"} Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.392298 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"3088e33b192cba6b8c9e46aca6502b62259bfa34a962539f8dceeb52a4e6d734"} Feb 17 09:05:24 crc kubenswrapper[4848]: E0217 09:05:24.720391 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="1.6s" Feb 17 09:05:24 crc kubenswrapper[4848]: W0217 09:05:24.789483 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:24 crc kubenswrapper[4848]: E0217 09:05:24.789878 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 17 09:05:24 crc kubenswrapper[4848]: W0217 09:05:24.894712 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to 
list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:24 crc kubenswrapper[4848]: E0217 09:05:24.894878 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.946606 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.948604 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.948687 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.948711 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:24 crc kubenswrapper[4848]: I0217 09:05:24.948790 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 09:05:24 crc kubenswrapper[4848]: E0217 09:05:24.949479 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.186630 4848 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 09:05:25 crc kubenswrapper[4848]: E0217 09:05:25.187946 4848 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while 
requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.69:6443: connect: connection refused" logger="UnhandledError" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.313384 4848 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.316531 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 17:27:41.32533128 +0000 UTC Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.401387 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33"} Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.401480 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f"} Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.401496 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.401501 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20"} Feb 17 
09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.401653 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064"} Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.403280 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.403348 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.403376 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.404813 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0" exitCode=0 Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.405023 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0"} Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.405073 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.406882 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.406952 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.406977 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.407726 4848 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7fb53737be5ff4203f1364b05c1d070f2b64c59ecfc936d921baae95450acc77" exitCode=0 Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.407810 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7fb53737be5ff4203f1364b05c1d070f2b64c59ecfc936d921baae95450acc77"} Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.407952 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.409324 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.409326 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.409486 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.409522 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.410698 4848 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="0f5c4d217c8f3b953d77abd40df91330c582863ededd1f35f16fd05f187c9968" exitCode=0 Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.410800 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.410747 4848 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"0f5c4d217c8f3b953d77abd40df91330c582863ededd1f35f16fd05f187c9968"} Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.410984 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.411079 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.411105 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.412118 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.412136 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.412147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.414457 4848 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e" exitCode=0 Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.414591 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.414613 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e"} Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.416626 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.416687 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:25 crc kubenswrapper[4848]: I0217 09:05:25.416710 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.313250 4848 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.69:6443: connect: connection refused Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.316840 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 19:10:10.586984612 +0000 UTC Feb 17 09:05:26 crc kubenswrapper[4848]: E0217 09:05:26.321441 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="3.2s" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.421143 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.421594 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857"} Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.421632 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"1eacf65ee572ba8ca73ed4cc5368d381803152a667577f11c4f91a7dc763256a"} Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.421645 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a"} Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.422456 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.422495 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.422506 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.426375 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf"} Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.426414 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692"} Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 
09:05:26.426424 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879"} Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.426433 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6"} Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.428162 4848 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c3427f712b264815c5148edafb4ff7186faccbf0bccce9512d6a4df6cc57a339" exitCode=0 Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.428256 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c3427f712b264815c5148edafb4ff7186faccbf0bccce9512d6a4df6cc57a339"} Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.428312 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.429616 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.429659 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.429676 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.430328 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:26 crc 
kubenswrapper[4848]: I0217 09:05:26.430366 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"55cee3e3bf4490cd3a68bdfb2f8b6992818301a6b7e0c8c9ece874ac078810a9"} Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.430844 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.431527 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.431556 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.431566 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.432273 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.432295 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.432306 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.550446 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.551651 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.551680 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.551692 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:26 crc kubenswrapper[4848]: I0217 09:05:26.551714 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 09:05:26 crc kubenswrapper[4848]: E0217 09:05:26.552095 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.69:6443: connect: connection refused" node="crc" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.070107 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.316975 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 12:06:25.565706602 +0000 UTC Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.437275 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7"} Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.437307 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.438524 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.438574 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.438590 4848 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.441052 4848 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="c9bd5a21d79a7b9825b15552676a11c4c75c8eba4654d47508452ad923587b7d" exitCode=0 Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.441130 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"c9bd5a21d79a7b9825b15552676a11c4c75c8eba4654d47508452ad923587b7d"} Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.441204 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.441222 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.441257 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.441281 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.441131 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.442628 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.442680 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.442695 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.442681 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.442732 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.442750 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.442796 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.442835 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.442853 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.442991 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.443216 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:27 crc kubenswrapper[4848]: I0217 09:05:27.443231 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:28 crc kubenswrapper[4848]: I0217 09:05:28.317741 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 20:29:54.016689019 +0000 UTC Feb 17 09:05:28 crc kubenswrapper[4848]: I0217 09:05:28.351221 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:28 crc kubenswrapper[4848]: I0217 09:05:28.451059 4848 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e8ff0c548222abded2a2a1dc81cbe500edabcca08aa10f362b45f9eee2953d43"} Feb 17 09:05:28 crc kubenswrapper[4848]: I0217 09:05:28.451115 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6d40404ebda7101d913e0dded89797c76100db568074a844454b0bad88c8925d"} Feb 17 09:05:28 crc kubenswrapper[4848]: I0217 09:05:28.451138 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"101d3f4aaeb11bb146cc357ca24ae8f12cda88dc9088351c4c0765be56f91151"} Feb 17 09:05:28 crc kubenswrapper[4848]: I0217 09:05:28.451182 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:28 crc kubenswrapper[4848]: I0217 09:05:28.452339 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:28 crc kubenswrapper[4848]: I0217 09:05:28.452372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:28 crc kubenswrapper[4848]: I0217 09:05:28.452381 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.304675 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.318723 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 03:33:57.684037767 +0000 UTC Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.457178 4848 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.457846 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c24770f056283550f0ae425d5bf2150923ac3a06f35b68b0a2d18ca7c28e7aed"} Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.457886 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"df6bf73bfc6b519f2acff6ae6b29373b167447e17c560d0bd087d0278cc8fa5c"} Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.457929 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.458284 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.458311 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.458322 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.458714 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.458778 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.458793 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.479106 4848 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.632967 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.752532 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.754566 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.754616 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.754636 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.754668 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.997041 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.997318 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.999132 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.999197 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:29 crc kubenswrapper[4848]: I0217 09:05:29.999226 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.059057 
4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.319856 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 01:42:35.058197452 +0000 UTC Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.460546 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.460609 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.462257 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.462313 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.462331 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.462383 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.462424 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.462447 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.602248 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.602477 
4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.604394 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.604446 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:30 crc kubenswrapper[4848]: I0217 09:05:30.604464 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.136838 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.137079 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.138574 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.138636 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.138661 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.320690 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 09:32:25.825211018 +0000 UTC Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.463085 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.463170 4848 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.464132 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.464175 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.464187 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.464398 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.464433 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:31 crc kubenswrapper[4848]: I0217 09:05:31.464444 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.321188 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 05:20:43.993496057 +0000 UTC Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.403287 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.413846 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.414059 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.415423 4848 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.415471 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.415488 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.421854 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.465315 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.465492 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.466333 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.466413 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.466437 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.467093 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.467217 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:32 crc kubenswrapper[4848]: I0217 09:05:32.467344 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:33 crc 
kubenswrapper[4848]: I0217 09:05:33.321637 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 07:30:38.379140399 +0000 UTC Feb 17 09:05:33 crc kubenswrapper[4848]: E0217 09:05:33.446505 4848 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 17 09:05:33 crc kubenswrapper[4848]: I0217 09:05:33.603206 4848 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 09:05:33 crc kubenswrapper[4848]: I0217 09:05:33.603299 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 09:05:34 crc kubenswrapper[4848]: I0217 09:05:34.321947 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 05:13:23.794583539 +0000 UTC Feb 17 09:05:35 crc kubenswrapper[4848]: I0217 09:05:35.322810 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 20:10:32.026791829 +0000 UTC Feb 17 09:05:36 crc kubenswrapper[4848]: I0217 09:05:36.323187 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 00:39:30.242513754 +0000 UTC Feb 17 09:05:36 crc 
kubenswrapper[4848]: W0217 09:05:36.892514 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 17 09:05:36 crc kubenswrapper[4848]: I0217 09:05:36.892639 4848 trace.go:236] Trace[743997592]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 09:05:26.890) (total time: 10001ms): Feb 17 09:05:36 crc kubenswrapper[4848]: Trace[743997592]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:05:36.892) Feb 17 09:05:36 crc kubenswrapper[4848]: Trace[743997592]: [10.001722312s] [10.001722312s] END Feb 17 09:05:36 crc kubenswrapper[4848]: E0217 09:05:36.892669 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 17 09:05:36 crc kubenswrapper[4848]: W0217 09:05:36.959673 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 17 09:05:36 crc kubenswrapper[4848]: I0217 09:05:36.959803 4848 trace.go:236] Trace[1259006034]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 09:05:26.957) (total time: 10001ms): Feb 17 09:05:36 crc kubenswrapper[4848]: Trace[1259006034]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:05:36.959) Feb 17 09:05:36 crc 
kubenswrapper[4848]: Trace[1259006034]: [10.001924062s] [10.001924062s] END Feb 17 09:05:36 crc kubenswrapper[4848]: E0217 09:05:36.959827 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 17 09:05:36 crc kubenswrapper[4848]: W0217 09:05:36.960729 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 17 09:05:36 crc kubenswrapper[4848]: I0217 09:05:36.960825 4848 trace.go:236] Trace[970622377]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 09:05:26.959) (total time: 10001ms): Feb 17 09:05:36 crc kubenswrapper[4848]: Trace[970622377]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:05:36.960) Feb 17 09:05:36 crc kubenswrapper[4848]: Trace[970622377]: [10.001283003s] [10.001283003s] END Feb 17 09:05:36 crc kubenswrapper[4848]: E0217 09:05:36.960847 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.077341 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 
09:05:37.077499 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.079257 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.079294 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.079303 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.314687 4848 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.324161 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 19:58:03.323656181 +0000 UTC Feb 17 09:05:37 crc kubenswrapper[4848]: W0217 09:05:37.512860 4848 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.512991 4848 trace.go:236] Trace[2015313889]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (17-Feb-2026 09:05:27.511) (total time: 10001ms): Feb 17 09:05:37 crc kubenswrapper[4848]: Trace[2015313889]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (09:05:37.512) Feb 17 09:05:37 crc kubenswrapper[4848]: Trace[2015313889]: 
[10.001288147s] [10.001288147s] END Feb 17 09:05:37 crc kubenswrapper[4848]: E0217 09:05:37.513026 4848 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.830839 4848 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.830964 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.852942 4848 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 17 09:05:37 crc kubenswrapper[4848]: I0217 09:05:37.853037 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Feb 17 09:05:38 crc 
kubenswrapper[4848]: I0217 09:05:38.325212 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 15:35:56.078337766 +0000 UTC Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.313382 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.313591 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.315376 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.315436 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.315453 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.320981 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.325946 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 21:17:30.707935885 +0000 UTC Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.486877 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.486945 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.488118 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.488167 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.488184 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.671122 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.671410 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.673126 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.673171 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.673198 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:39 crc kubenswrapper[4848]: I0217 09:05:39.690158 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Feb 17 09:05:40 crc kubenswrapper[4848]: I0217 09:05:40.326357 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 15:52:26.152231445 +0000 UTC Feb 17 09:05:40 crc kubenswrapper[4848]: I0217 09:05:40.409852 4848 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 17 09:05:40 crc kubenswrapper[4848]: I0217 09:05:40.489803 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:40 crc 
kubenswrapper[4848]: I0217 09:05:40.491244 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:40 crc kubenswrapper[4848]: I0217 09:05:40.491444 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:40 crc kubenswrapper[4848]: I0217 09:05:40.491578 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:41 crc kubenswrapper[4848]: I0217 09:05:41.327116 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 16:36:15.6729898 +0000 UTC Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.212895 4848 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.327598 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 08:05:58.180413009 +0000 UTC Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.557025 4848 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 17 09:05:42 crc kubenswrapper[4848]: E0217 09:05:42.831457 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.837866 4848 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 17 09:05:42 crc kubenswrapper[4848]: E0217 09:05:42.838394 4848 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" 
node="crc" Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.842122 4848 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.870867 4848 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.898091 4848 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51426->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.898128 4848 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51432->192.168.126.11:17697: read: connection reset by peer" start-of-body= Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.898156 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51426->192.168.126.11:17697: read: connection reset by peer" Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.898243 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:51432->192.168.126.11:17697: read: connection reset by peer" Feb 17 09:05:42 
crc kubenswrapper[4848]: I0217 09:05:42.899221 4848 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Feb 17 09:05:42 crc kubenswrapper[4848]: I0217 09:05:42.899262 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.306573 4848 apiserver.go:52] "Watching apiserver" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.309721 4848 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.310057 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.310581 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.310676 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.310810 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.313698 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.313629 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.314052 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.314033 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.314243 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.314525 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.318345 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.318496 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.318527 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.318621 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.318688 4848 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.318868 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.319065 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.319357 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.319365 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.320955 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.328096 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 05:40:20.815290371 +0000 UTC Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341012 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341092 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341145 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341191 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") 
pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341240 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341284 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341350 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341396 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341446 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341492 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341534 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341577 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341619 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341665 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341662 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod 
"9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341714 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341802 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341856 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341906 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.341957 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 09:05:43 
crc kubenswrapper[4848]: I0217 09:05:43.341955 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342002 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342014 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342054 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342081 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342103 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342149 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342197 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342244 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342289 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342334 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342377 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342424 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342492 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342537 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342582 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: 
\"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342632 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342679 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342724 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342814 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342868 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342915 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342966 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.343261 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.343339 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.343393 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.343441 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342181 4848 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342448 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.342420 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.343211 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.344866 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.345632 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.345677 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.345843 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.345881 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346242 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346282 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346315 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346661 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346708 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346742 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 09:05:43 crc 
kubenswrapper[4848]: I0217 09:05:43.346802 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346838 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346870 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346907 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346940 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346974 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347009 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347042 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347073 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347106 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347142 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347175 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347209 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347241 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347281 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347319 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347366 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347421 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347464 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347496 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347530 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347567 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") 
" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347612 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347654 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347696 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347733 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347803 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347834 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347872 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347910 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347954 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347995 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348029 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 09:05:43 crc 
kubenswrapper[4848]: I0217 09:05:43.348060 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348092 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348125 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348157 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348189 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348223 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348339 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348525 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348593 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348744 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348824 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348859 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348893 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349044 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349082 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349118 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349151 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349185 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349217 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349248 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349285 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349320 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" 
(UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349353 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349386 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349418 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349518 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349563 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349605 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349643 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349684 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349718 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349752 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349812 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349845 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349878 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349913 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349948 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349983 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.350018 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.350066 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.350115 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.350171 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.350221 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.350271 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 17 09:05:43 crc 
kubenswrapper[4848]: I0217 09:05:43.350320 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.350473 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.343447 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.343634 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.343908 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.343945 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.350962 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.343951 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.344037 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.344057 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.344349 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.344240 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.344587 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.344612 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.344817 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.344842 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.345074 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.345243 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.345357 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.345918 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.346988 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347095 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347442 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347452 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347659 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.347931 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348270 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348323 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348328 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.348629 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349077 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349265 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349615 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.349883 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.351479 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.351955 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.352178 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.352240 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.352266 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.352529 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.352781 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.353046 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.353351 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.353409 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.353481 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.353840 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.354008 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.354259 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.354335 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.354453 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.354527 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.354991 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.355475 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.355577 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.355576 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.350949 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.355829 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.355842 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.355928 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356042 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356081 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356120 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356155 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356194 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356289 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356306 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356364 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356387 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356606 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356674 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356741 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356855 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356909 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356967 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357024 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357080 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357153 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357209 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod 
\"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357263 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357318 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357371 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357425 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357478 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357539 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357593 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357645 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357701 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357793 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357853 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357972 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358032 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358088 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358156 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358210 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358275 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358339 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358403 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358461 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358514 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358574 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Feb 17 
09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358627 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358678 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358733 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356446 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356546 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.361104 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356613 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.356952 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357214 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357253 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357534 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357652 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.357809 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358337 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358662 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358698 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.358799 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:05:43.85875194 +0000 UTC m=+21.402007686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.358842 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.359033 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.359108 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.359117 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.359304 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.360445 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.360501 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.360534 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.361070 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.361581 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.361791 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.362010 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.362348 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.362406 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.362466 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.363050 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.363150 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.363218 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.363282 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.363255 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.363869 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.363915 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.363940 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.364746 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365648 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365687 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365716 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365750 4848 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365805 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365831 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365855 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365878 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365900 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: 
\"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365923 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365947 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365970 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365991 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366069 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366102 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366128 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366177 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366204 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366233 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 
09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366261 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366291 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366315 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366339 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366367 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: 
\"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366394 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366432 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366458 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366634 4848 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366652 4848 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366669 
4848 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366683 4848 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366698 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366712 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366726 4848 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366739 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366752 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366793 4848 reconciler_common.go:293] 
"Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366806 4848 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366819 4848 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366862 4848 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366875 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366888 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366901 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.366913 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367033 4848 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367049 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367061 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367074 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367086 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367098 4848 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367113 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on 
node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367126 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367139 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367153 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367165 4848 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367176 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367189 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.364242 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: 
"c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.364296 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.364460 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.364501 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.364715 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.364712 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365013 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365153 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365597 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365823 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.365919 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.362593 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367604 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367630 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367648 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367668 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367688 4848 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367700 4848 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367713 4848 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc 
kubenswrapper[4848]: I0217 09:05:43.367725 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367737 4848 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367751 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367781 4848 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367795 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367808 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367821 4848 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367834 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367847 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367858 4848 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367871 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367884 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367896 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367908 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367921 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc 
kubenswrapper[4848]: I0217 09:05:43.367933 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367945 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367957 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367968 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367980 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.367992 4848 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368003 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368017 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368029 4848 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368060 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368073 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368085 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368097 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368109 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368120 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: 
I0217 09:05:43.368133 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368144 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368156 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368169 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368181 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368193 4848 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368209 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 
09:05:43.368221 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368234 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368246 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368258 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368269 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368281 4848 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368325 4848 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368337 4848 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368349 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368396 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368409 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368421 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368434 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368445 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368457 4848 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368470 4848 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368483 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368495 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368507 4848 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368518 4848 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368530 4848 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368544 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: 
\"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.368556 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.368659 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.368723 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:43.868699825 +0000 UTC m=+21.411955481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.369163 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.369223 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.369326 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.369613 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.369686 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.369786 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.370085 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.370109 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.370191 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.370244 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-17 09:05:43.870229907 +0000 UTC m=+21.413485563 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.371183 4848 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.371740 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.372021 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.373197 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.373611 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.373816 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.374118 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.374607 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.374889 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.375607 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.376433 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.376956 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.377117 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.377298 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.377688 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.378329 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.378457 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.378467 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.378579 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.379713 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.379880 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.380078 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.381212 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.381400 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.381459 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.381524 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.381911 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.382969 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.383556 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.387262 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.389992 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.391100 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.391874 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.392504 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.392536 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.392550 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.392618 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:43.892598264 +0000 UTC m=+21.435853920 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.392825 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.393496 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.393663 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.393804 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.397359 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.396929 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.397440 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:43.897414836 +0000 UTC m=+21.440670612 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.393959 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.394367 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.394510 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.395200 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.395335 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.397507 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.395442 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.396710 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.396807 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.397305 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.397688 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.399096 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.399240 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.399552 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.400844 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.401096 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.401414 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.401715 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.401819 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.402702 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.402839 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.403127 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.403791 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.405619 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.407959 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.409161 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 
09:05:43.410881 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.412450 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.413433 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.415195 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.415328 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.416300 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.416431 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.416505 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.416526 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.416572 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.417014 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.416580 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.416845 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.416840 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.417233 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.417316 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.417319 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.417675 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.417742 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.417792 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.417861 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.417985 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.418512 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.418552 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.419445 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.419787 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.419936 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.420580 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.420736 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.421637 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.422206 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.424510 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.427249 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.425499 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.429993 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.431437 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.433300 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.435318 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.436210 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.437199 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.438549 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.439282 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.440050 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.441294 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.442342 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" 
path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.444166 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.444911 4848 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.445078 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.449270 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.449707 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.450180 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 
09:05:43.451032 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.452612 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.454426 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.454802 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.456653 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.457752 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.459811 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.461266 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.461367 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.462362 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.463253 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.464363 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.465385 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 17 09:05:43 crc 
kubenswrapper[4848]: I0217 09:05:43.465949 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.466670 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.467864 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.468790 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469178 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469348 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469371 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469427 4848 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469442 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469461 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469486 4848 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469504 4848 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469517 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469529 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: 
\"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469543 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469555 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469567 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469645 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469676 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469887 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469896 4848 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469944 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469958 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469969 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469981 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.469993 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470006 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470017 4848 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470029 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470040 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470052 4848 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470063 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470073 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470085 4848 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470096 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 
crc kubenswrapper[4848]: I0217 09:05:43.470110 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470137 4848 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470158 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470172 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470186 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470200 4848 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470212 4848 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470224 4848 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470235 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470249 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470261 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470272 4848 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470283 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470295 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470322 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470334 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470346 4848 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470358 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470489 4848 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470518 4848 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470535 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470556 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 
crc kubenswrapper[4848]: I0217 09:05:43.470569 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470574 4848 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470644 4848 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470657 4848 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470670 4848 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470684 4848 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470697 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470709 4848 reconciler_common.go:293] "Volume detached for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470720 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470732 4848 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470743 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470774 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470786 4848 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470798 4848 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470810 4848 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on 
node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470821 4848 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470832 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470843 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470854 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470865 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470877 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470889 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 
09:05:43.470900 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470911 4848 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470922 4848 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470933 4848 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470944 4848 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470955 4848 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470967 4848 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470979 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.470991 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471001 4848 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471013 4848 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471025 4848 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471037 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471049 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471061 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: 
\"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471072 4848 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471083 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471094 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471113 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471128 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471145 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471163 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471179 4848 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471194 4848 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471208 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471222 4848 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471238 4848 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471253 4848 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.471540 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 17 09:05:43 
crc kubenswrapper[4848]: I0217 09:05:43.472141 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.472776 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.473679 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.476007 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.484915 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.493813 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.498911 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.500669 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7" exitCode=255 Feb 17 09:05:43 crc 
kubenswrapper[4848]: I0217 09:05:43.500704 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7"} Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.506754 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.514660 4848 scope.go:117] "RemoveContainer" containerID="9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.515913 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.519259 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.531663 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.545549 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.561640 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.578847 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.590323 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.600179 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.602543 4848 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.602602 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.608946 4848 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.623998 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\"
:\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.636819 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.639590 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.653843 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 17 09:05:43 crc kubenswrapper[4848]: W0217 09:05:43.655043 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-2758f2cdc84a39511108723c35edab7f7e67f0e409c6001a5e81569e3bc5c9a5 WatchSource:0}: Error finding container 2758f2cdc84a39511108723c35edab7f7e67f0e409c6001a5e81569e3bc5c9a5: Status 404 returned error can't find the container with id 2758f2cdc84a39511108723c35edab7f7e67f0e409c6001a5e81569e3bc5c9a5 Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.675170 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 17 09:05:43 crc kubenswrapper[4848]: W0217 09:05:43.681507 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-9c1f928a9b7e4a9658cb56b553a737803bca47efcd35d00ad0cdc395ec1acd3b WatchSource:0}: Error finding container 9c1f928a9b7e4a9658cb56b553a737803bca47efcd35d00ad0cdc395ec1acd3b: Status 404 returned error can't find the container with id 9c1f928a9b7e4a9658cb56b553a737803bca47efcd35d00ad0cdc395ec1acd3b Feb 17 09:05:43 crc kubenswrapper[4848]: W0217 09:05:43.690700 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-841a46df22f8362a402464c74ed08734018e773085f2c05643057948c6c49445 WatchSource:0}: Error finding container 841a46df22f8362a402464c74ed08734018e773085f2c05643057948c6c49445: Status 404 returned error can't find the container with id 841a46df22f8362a402464c74ed08734018e773085f2c05643057948c6c49445 Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.874189 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.874284 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.874381 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:05:44.874348485 +0000 UTC m=+22.417604131 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.874411 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.874471 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.874499 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:44.874477299 +0000 UTC m=+22.417733155 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.874524 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.874562 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:44.874550971 +0000 UTC m=+22.417806617 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.975721 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:43 crc kubenswrapper[4848]: I0217 09:05:43.975803 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.975922 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.975937 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.975949 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.976013 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:44.975986948 +0000 UTC m=+22.519242594 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.976082 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.976129 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.976154 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:43 crc kubenswrapper[4848]: E0217 09:05:43.976237 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:44.976209984 +0000 UTC m=+22.519465670 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.329562 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 19:55:39.845444509 +0000 UTC Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.505273 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"841a46df22f8362a402464c74ed08734018e773085f2c05643057948c6c49445"} Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.507491 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e"} Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.507538 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538"} Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.507559 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9c1f928a9b7e4a9658cb56b553a737803bca47efcd35d00ad0cdc395ec1acd3b"} Feb 17 09:05:44 
crc kubenswrapper[4848]: I0217 09:05:44.509276 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd"} Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.509327 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"2758f2cdc84a39511108723c35edab7f7e67f0e409c6001a5e81569e3bc5c9a5"} Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.513839 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.516809 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247"} Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.517423 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.530023 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.547842 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.566271 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.586805 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": 
net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.608848 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.631331 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.652205 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.674106 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.689555 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.715620 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.734708 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.754713 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.771902 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.790048 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.882738 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.882889 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.882933 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.882964 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:05:46.882926584 +0000 UTC m=+24.426182250 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.883030 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.883054 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.883140 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:46.883107389 +0000 UTC m=+24.426363065 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.883164 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-17 09:05:46.8831529 +0000 UTC m=+24.426408586 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.984195 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:44 crc kubenswrapper[4848]: I0217 09:05:44.984294 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.984424 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.984460 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.984472 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 
09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.984487 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.984492 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.984506 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.984594 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:46.984569096 +0000 UTC m=+24.527824782 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:44 crc kubenswrapper[4848]: E0217 09:05:44.984625 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:46.984612737 +0000 UTC m=+24.527868413 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.329873 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 17:42:09.396931993 +0000 UTC Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.382606 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.382659 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.382612 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:45 crc kubenswrapper[4848]: E0217 09:05:45.382789 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:05:45 crc kubenswrapper[4848]: E0217 09:05:45.382902 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:05:45 crc kubenswrapper[4848]: E0217 09:05:45.382981 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.387927 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.388705 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.390055 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.390831 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.392022 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.392730 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.394031 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.394647 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.395301 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.396268 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Feb 17 09:05:45 crc kubenswrapper[4848]: I0217 09:05:45.397036 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Feb 17 09:05:46 crc kubenswrapper[4848]: I0217 09:05:46.329985 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 00:42:46.678145512 +0000 UTC Feb 17 09:05:46 crc kubenswrapper[4848]: I0217 09:05:46.902724 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:46 crc kubenswrapper[4848]: E0217 09:05:46.902827 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:05:50.902775194 +0000 UTC m=+28.446030850 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:05:46 crc kubenswrapper[4848]: I0217 09:05:46.902875 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:46 crc kubenswrapper[4848]: I0217 09:05:46.902942 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:46 crc kubenswrapper[4848]: E0217 09:05:46.903031 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:46 crc kubenswrapper[4848]: E0217 09:05:46.903077 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:46 crc kubenswrapper[4848]: E0217 09:05:46.903081 4848 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:50.903070632 +0000 UTC m=+28.446326278 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:46 crc kubenswrapper[4848]: E0217 09:05:46.903150 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:50.903133794 +0000 UTC m=+28.446389440 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:46 crc kubenswrapper[4848]: I0217 09:05:46.936811 4848 csr.go:261] certificate signing request csr-lwwm5 is approved, waiting to be issued Feb 17 09:05:46 crc kubenswrapper[4848]: I0217 09:05:46.963321 4848 csr.go:257] certificate signing request csr-lwwm5 is issued Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.003689 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.003914 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.003849 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.003964 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.003977 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.004027 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:51.004010455 +0000 UTC m=+28.547266101 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.004098 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.004120 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.004131 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.004170 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:51.00415866 +0000 UTC m=+28.547414306 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.028060 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-gv2xs"] Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.028362 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.029901 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.030003 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.030324 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.030515 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.043367 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.056481 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.068065 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.079646 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.090171 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.103001 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.104891 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5526fe6c-c04c-4ef2-a482-be066235c702-serviceca\") pod \"node-ca-gv2xs\" (UID: \"5526fe6c-c04c-4ef2-a482-be066235c702\") " pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.104973 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5526fe6c-c04c-4ef2-a482-be066235c702-host\") pod \"node-ca-gv2xs\" (UID: \"5526fe6c-c04c-4ef2-a482-be066235c702\") " pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.104996 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z8lb7\" (UniqueName: \"kubernetes.io/projected/5526fe6c-c04c-4ef2-a482-be066235c702-kube-api-access-z8lb7\") pod \"node-ca-gv2xs\" (UID: \"5526fe6c-c04c-4ef2-a482-be066235c702\") " pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.114429 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.126907 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.206376 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5526fe6c-c04c-4ef2-a482-be066235c702-serviceca\") pod \"node-ca-gv2xs\" (UID: \"5526fe6c-c04c-4ef2-a482-be066235c702\") " pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.206422 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5526fe6c-c04c-4ef2-a482-be066235c702-host\") pod \"node-ca-gv2xs\" (UID: \"5526fe6c-c04c-4ef2-a482-be066235c702\") " pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.206439 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8lb7\" (UniqueName: \"kubernetes.io/projected/5526fe6c-c04c-4ef2-a482-be066235c702-kube-api-access-z8lb7\") pod \"node-ca-gv2xs\" (UID: \"5526fe6c-c04c-4ef2-a482-be066235c702\") 
" pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.206505 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5526fe6c-c04c-4ef2-a482-be066235c702-host\") pod \"node-ca-gv2xs\" (UID: \"5526fe6c-c04c-4ef2-a482-be066235c702\") " pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.208961 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5526fe6c-c04c-4ef2-a482-be066235c702-serviceca\") pod \"node-ca-gv2xs\" (UID: \"5526fe6c-c04c-4ef2-a482-be066235c702\") " pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.226511 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8lb7\" (UniqueName: \"kubernetes.io/projected/5526fe6c-c04c-4ef2-a482-be066235c702-kube-api-access-z8lb7\") pod \"node-ca-gv2xs\" (UID: \"5526fe6c-c04c-4ef2-a482-be066235c702\") " pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.330564 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 07:29:43.723843845 +0000 UTC Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.338892 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-gv2xs" Feb 17 09:05:47 crc kubenswrapper[4848]: W0217 09:05:47.359611 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5526fe6c_c04c_4ef2_a482_be066235c702.slice/crio-09ac1b4595c2a82079555a02b7a382d5ac0518bcbf8f6c5a757f9f17f1adc241 WatchSource:0}: Error finding container 09ac1b4595c2a82079555a02b7a382d5ac0518bcbf8f6c5a757f9f17f1adc241: Status 404 returned error can't find the container with id 09ac1b4595c2a82079555a02b7a382d5ac0518bcbf8f6c5a757f9f17f1adc241 Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.383346 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.383388 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.383435 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.383491 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.383556 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.383610 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.420725 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-fn2gp"] Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.421108 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-fn2gp" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.424262 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.424427 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.424620 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-stvnz"] Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.425047 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.425523 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.427230 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.427469 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.427687 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.427751 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.429372 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.440407 
4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.453206 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.470653 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.482333 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.495768 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.509435 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5fb23008-bf50-4af1-812e-b8fa98dda9bf-hosts-file\") pod \"node-resolver-fn2gp\" (UID: \"5fb23008-bf50-4af1-812e-b8fa98dda9bf\") " pod="openshift-dns/node-resolver-fn2gp" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.509477 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7nzp\" (UniqueName: 
\"kubernetes.io/projected/7c28fed4-873d-42f6-ae63-03d12a425d0a-kube-api-access-m7nzp\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.509504 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c28fed4-873d-42f6-ae63-03d12a425d0a-proxy-tls\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.509527 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c28fed4-873d-42f6-ae63-03d12a425d0a-mcd-auth-proxy-config\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.509642 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7c28fed4-873d-42f6-ae63-03d12a425d0a-rootfs\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.509675 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hpm5\" (UniqueName: \"kubernetes.io/projected/5fb23008-bf50-4af1-812e-b8fa98dda9bf-kube-api-access-5hpm5\") pod \"node-resolver-fn2gp\" (UID: \"5fb23008-bf50-4af1-812e-b8fa98dda9bf\") " pod="openshift-dns/node-resolver-fn2gp" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.511307 4848 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.522927 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.525061 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb"} Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.525834 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gv2xs" event={"ID":"5526fe6c-c04c-4ef2-a482-be066235c702","Type":"ContainerStarted","Data":"09ac1b4595c2a82079555a02b7a382d5ac0518bcbf8f6c5a757f9f17f1adc241"} Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.533417 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.542717 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.568520 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.582665 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.596546 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.607690 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.610131 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5fb23008-bf50-4af1-812e-b8fa98dda9bf-hosts-file\") pod \"node-resolver-fn2gp\" (UID: \"5fb23008-bf50-4af1-812e-b8fa98dda9bf\") " pod="openshift-dns/node-resolver-fn2gp" Feb 
17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.610180 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7nzp\" (UniqueName: \"kubernetes.io/projected/7c28fed4-873d-42f6-ae63-03d12a425d0a-kube-api-access-m7nzp\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.610219 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c28fed4-873d-42f6-ae63-03d12a425d0a-proxy-tls\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.610264 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c28fed4-873d-42f6-ae63-03d12a425d0a-mcd-auth-proxy-config\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.610291 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hpm5\" (UniqueName: \"kubernetes.io/projected/5fb23008-bf50-4af1-812e-b8fa98dda9bf-kube-api-access-5hpm5\") pod \"node-resolver-fn2gp\" (UID: \"5fb23008-bf50-4af1-812e-b8fa98dda9bf\") " pod="openshift-dns/node-resolver-fn2gp" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.610314 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7c28fed4-873d-42f6-ae63-03d12a425d0a-rootfs\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " 
pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.610375 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/7c28fed4-873d-42f6-ae63-03d12a425d0a-rootfs\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.610534 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/5fb23008-bf50-4af1-812e-b8fa98dda9bf-hosts-file\") pod \"node-resolver-fn2gp\" (UID: \"5fb23008-bf50-4af1-812e-b8fa98dda9bf\") " pod="openshift-dns/node-resolver-fn2gp" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.611218 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7c28fed4-873d-42f6-ae63-03d12a425d0a-mcd-auth-proxy-config\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.615343 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7c28fed4-873d-42f6-ae63-03d12a425d0a-proxy-tls\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.621209 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.624105 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hpm5\" (UniqueName: \"kubernetes.io/projected/5fb23008-bf50-4af1-812e-b8fa98dda9bf-kube-api-access-5hpm5\") pod \"node-resolver-fn2gp\" (UID: \"5fb23008-bf50-4af1-812e-b8fa98dda9bf\") " pod="openshift-dns/node-resolver-fn2gp" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.629585 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7nzp\" (UniqueName: \"kubernetes.io/projected/7c28fed4-873d-42f6-ae63-03d12a425d0a-kube-api-access-m7nzp\") pod \"machine-config-daemon-stvnz\" (UID: \"7c28fed4-873d-42f6-ae63-03d12a425d0a\") " pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.632336 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.643085 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.655665 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.667685 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.676723 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.733099 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-fn2gp" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.739407 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:05:47 crc kubenswrapper[4848]: W0217 09:05:47.746193 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fb23008_bf50_4af1_812e_b8fa98dda9bf.slice/crio-a6d78fe18cec1bf0cfce499b614d40cf13bb7d44bace1ce740a3e41c7df6c8d7 WatchSource:0}: Error finding container a6d78fe18cec1bf0cfce499b614d40cf13bb7d44bace1ce740a3e41c7df6c8d7: Status 404 returned error can't find the container with id a6d78fe18cec1bf0cfce499b614d40cf13bb7d44bace1ce740a3e41c7df6c8d7 Feb 17 09:05:47 crc kubenswrapper[4848]: W0217 09:05:47.768091 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c28fed4_873d_42f6_ae63_03d12a425d0a.slice/crio-e6aee7951cc293d72d783f9813a19925232e2eeb39367087298f22c8ca8b998c WatchSource:0}: Error finding container e6aee7951cc293d72d783f9813a19925232e2eeb39367087298f22c8ca8b998c: Status 404 returned error can't find the container with id e6aee7951cc293d72d783f9813a19925232e2eeb39367087298f22c8ca8b998c Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.818943 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4fvgf"] Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.819673 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: W0217 09:05:47.827855 4848 reflector.go:561] object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert": failed to list *v1.Secret: secrets "ovn-node-metrics-cert" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-ovn-kubernetes": no relationship found between node 'crc' and this object Feb 17 09:05:47 crc kubenswrapper[4848]: E0217 09:05:47.827893 4848 reflector.go:158] "Unhandled Error" err="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"ovn-node-metrics-cert\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-ovn-kubernetes\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.827953 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.828181 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.828277 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.828387 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.828483 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.828578 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.828896 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-t94zv"] Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.829403 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-6rgmx"] Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.829581 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.829830 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.832712 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.833017 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.833903 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.838286 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.838311 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.838492 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.838540 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"kube-root-ca.crt" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.850114 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.890526 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912528 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-system-cni-dir\") pod \"multus-additional-cni-plugins-t94zv\" (UID: 
\"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912571 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-openvswitch\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912590 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovn-node-metrics-cert\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912605 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-socket-dir-parent\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912623 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3244ef77-7b63-45b9-9b12-2b12cb6654df-cni-binary-copy\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912639 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-run-k8s-cni-cncf-io\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912685 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-var-lib-openvswitch\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912719 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-etc-kubernetes\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912738 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-systemd\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912769 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-netns\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912783 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-script-lib\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912823 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-cnibin\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912844 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-conf-dir\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912860 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-cni-binary-copy\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912874 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-systemd-units\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912888 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/3244ef77-7b63-45b9-9b12-2b12cb6654df-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912902 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-cni-dir\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912922 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912936 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-node-log\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912949 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912972 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-system-cni-dir\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912986 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-cnibin\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.912999 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-daemon-config\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913013 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-run-multus-certs\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913027 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-var-lib-cni-bin\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913040 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-etc-openvswitch\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913059 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913079 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-slash\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913094 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-config\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913139 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-hostroot\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913155 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-os-release\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913167 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-run-netns\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913182 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-var-lib-kubelet\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913196 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-os-release\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913210 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-kubelet\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913222 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwttz\" (UniqueName: \"kubernetes.io/projected/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-kube-api-access-lwttz\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913241 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-netd\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913270 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-env-overrides\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913284 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-var-lib-cni-multus\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913299 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-225sp\" (UniqueName: \"kubernetes.io/projected/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-kube-api-access-225sp\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913313 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-ovn\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913326 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-log-socket\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913339 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-bin\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.913353 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nngkk\" (UniqueName: \"kubernetes.io/projected/3244ef77-7b63-45b9-9b12-2b12cb6654df-kube-api-access-nngkk\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.920666 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.964767 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.964844 4848 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-17 09:00:46 +0000 UTC, rotation deadline is 2026-12-17 23:39:03.954974399 +0000 UTC Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.964887 4848 certificate_manager.go:356] 
kubernetes.io/kube-apiserver-client-kubelet: Waiting 7286h33m15.990089892s for next certificate rotation Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.981475 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:47 crc kubenswrapper[4848]: I0217 09:05:47.991867 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:47Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.003513 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014282 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-netns\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014313 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-systemd\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014338 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-script-lib\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014325 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014363 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-cnibin\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014379 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-cni-binary-copy\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014392 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-conf-dir\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014407 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-systemd-units\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014427 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3244ef77-7b63-45b9-9b12-2b12cb6654df-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " 
pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014429 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-netns\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014444 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-cni-dir\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014505 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-node-log\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014523 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014527 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-conf-dir\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014540 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-system-cni-dir\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014564 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-cnibin\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014565 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014584 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-cnibin\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014598 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-daemon-config\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014612 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-run-multus-certs\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014626 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-etc-openvswitch\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014640 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014652 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-cni-dir\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014657 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-var-lib-cni-bin\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014672 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-config\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014686 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-hostroot\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014688 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-systemd-units\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014767 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-slash\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014784 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-var-lib-kubelet\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014799 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-os-release\") pod \"multus-additional-cni-plugins-t94zv\" (UID: 
\"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014813 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-kubelet\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014826 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwttz\" (UniqueName: \"kubernetes.io/projected/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-kube-api-access-lwttz\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014839 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-os-release\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014852 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-run-netns\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014867 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-netd\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 
crc kubenswrapper[4848]: I0217 09:05:48.014880 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-var-lib-cni-multus\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014894 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-225sp\" (UniqueName: \"kubernetes.io/projected/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-kube-api-access-225sp\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014919 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-env-overrides\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014935 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-bin\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014951 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nngkk\" (UniqueName: \"kubernetes.io/projected/3244ef77-7b63-45b9-9b12-2b12cb6654df-kube-api-access-nngkk\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014965 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-ovn\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014979 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-log-socket\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014994 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-openvswitch\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015009 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovn-node-metrics-cert\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015024 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-socket-dir-parent\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015043 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-system-cni-dir\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015058 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3244ef77-7b63-45b9-9b12-2b12cb6654df-cni-binary-copy\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015073 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-run-k8s-cni-cncf-io\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015088 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-var-lib-openvswitch\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015093 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-script-lib\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015125 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-etc-kubernetes\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015129 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-cni-binary-copy\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015104 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-etc-kubernetes\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015206 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-ovn\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015230 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-run-netns\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015252 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-netd\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015267 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3244ef77-7b63-45b9-9b12-2b12cb6654df-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015302 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-log-socket\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015274 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-var-lib-cni-multus\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015310 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-openvswitch\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015319 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-os-release\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015340 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-node-log\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.014504 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-systemd\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015349 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-bin\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015372 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015378 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-ovn-kubernetes\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015372 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-system-cni-dir\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015418 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-hostroot\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015422 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-var-lib-kubelet\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015431 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-slash\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015450 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-kubelet\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015459 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-os-release\") pod 
\"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015480 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-system-cni-dir\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015481 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-run-multus-certs\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015506 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-run-k8s-cni-cncf-io\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015511 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-cnibin\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015522 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-var-lib-openvswitch\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc 
kubenswrapper[4848]: I0217 09:05:48.015534 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-socket-dir-parent\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015818 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3244ef77-7b63-45b9-9b12-2b12cb6654df-tuning-conf-dir\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015852 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3244ef77-7b63-45b9-9b12-2b12cb6654df-cni-binary-copy\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015881 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-env-overrides\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015892 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-etc-openvswitch\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015913 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-host-var-lib-cni-bin\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.015921 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-config\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.016317 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-multus-daemon-config\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.025685 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.030231 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-225sp\" (UniqueName: \"kubernetes.io/projected/ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6-kube-api-access-225sp\") pod \"multus-6rgmx\" (UID: \"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\") " pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.030838 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nngkk\" (UniqueName: \"kubernetes.io/projected/3244ef77-7b63-45b9-9b12-2b12cb6654df-kube-api-access-nngkk\") pod \"multus-additional-cni-plugins-t94zv\" (UID: \"3244ef77-7b63-45b9-9b12-2b12cb6654df\") " pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.031322 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lwttz\" (UniqueName: \"kubernetes.io/projected/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-kube-api-access-lwttz\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.035835 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.052997 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.065155 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.085666 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.097481 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.112181 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.125202 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.135463 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.149191 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\
",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.154258 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-6rgmx" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.160903 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: W0217 09:05:48.163354 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddcd58be_cbc2_4c49_b9fd_75e8d53e6ce6.slice/crio-5111ffcb39bbe7ff11d9b4c864126a68d23eb58fbc508b47dd39865102fd2d21 WatchSource:0}: Error finding container 5111ffcb39bbe7ff11d9b4c864126a68d23eb58fbc508b47dd39865102fd2d21: Status 404 returned error can't find the container with id 5111ffcb39bbe7ff11d9b4c864126a68d23eb58fbc508b47dd39865102fd2d21 Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.166398 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-t94zv" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.170304 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: W0217 09:05:48.178344 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3244ef77_7b63_45b9_9b12_2b12cb6654df.slice/crio-5f24ef588c501bcf50244480255614c81ea55d39c792b60171cc348b1fb6a992 WatchSource:0}: Error finding container 5f24ef588c501bcf50244480255614c81ea55d39c792b60171cc348b1fb6a992: Status 404 returned error can't find the container with id 
5f24ef588c501bcf50244480255614c81ea55d39c792b60171cc348b1fb6a992 Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.187686 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.202193 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.216239 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.225944 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.331086 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 00:56:22.225303086 +0000 UTC Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.531176 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" event={"ID":"3244ef77-7b63-45b9-9b12-2b12cb6654df","Type":"ContainerStarted","Data":"52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2"} Feb 17 09:05:48 crc 
kubenswrapper[4848]: I0217 09:05:48.531223 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" event={"ID":"3244ef77-7b63-45b9-9b12-2b12cb6654df","Type":"ContainerStarted","Data":"5f24ef588c501bcf50244480255614c81ea55d39c792b60171cc348b1fb6a992"} Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.532373 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fn2gp" event={"ID":"5fb23008-bf50-4af1-812e-b8fa98dda9bf","Type":"ContainerStarted","Data":"a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0"} Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.532397 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-fn2gp" event={"ID":"5fb23008-bf50-4af1-812e-b8fa98dda9bf","Type":"ContainerStarted","Data":"a6d78fe18cec1bf0cfce499b614d40cf13bb7d44bace1ce740a3e41c7df6c8d7"} Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.533483 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6rgmx" event={"ID":"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6","Type":"ContainerStarted","Data":"294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db"} Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.533507 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6rgmx" event={"ID":"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6","Type":"ContainerStarted","Data":"5111ffcb39bbe7ff11d9b4c864126a68d23eb58fbc508b47dd39865102fd2d21"} Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.535485 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e"} Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.535509 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8"} Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.535518 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"e6aee7951cc293d72d783f9813a19925232e2eeb39367087298f22c8ca8b998c"} Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.537245 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-gv2xs" event={"ID":"5526fe6c-c04c-4ef2-a482-be066235c702","Type":"ContainerStarted","Data":"5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4"} Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.549063 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.559870 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.573868 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.586168 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.597675 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.611937 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.623698 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.633203 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.644791 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\
",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.656533 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f3
6cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.666938 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.685146 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.699830 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.710926 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.734241 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.748629 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.759702 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.770619 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.790562 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"re
startCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn
kube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.805870 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.817588 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.830969 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.850755 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.868882 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.884520 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.901570 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:48Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.914874 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 17 09:05:48 crc kubenswrapper[4848]: I0217 09:05:48.919503 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovn-node-metrics-cert\") pod \"ovnkube-node-4fvgf\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.032225 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:49 crc kubenswrapper[4848]: W0217 09:05:49.047477 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f1c8d8b_192e_4f7f_a78a_0ed9b92e80e8.slice/crio-c9750d1add3ad0fb790a8bb1dc39fcd028bf1a3fe185f34e473c0dc886886286 WatchSource:0}: Error finding container c9750d1add3ad0fb790a8bb1dc39fcd028bf1a3fe185f34e473c0dc886886286: Status 404 returned error can't find the container with id c9750d1add3ad0fb790a8bb1dc39fcd028bf1a3fe185f34e473c0dc886886286 Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.238712 4848 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.240688 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.240722 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.240734 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.240858 4848 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.250739 4848 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.251052 4848 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.252246 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.252287 4848 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.252299 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.252313 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.252325 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: E0217 09:05:49.267691 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.271015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.271039 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.271046 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.271059 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.271067 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: E0217 09:05:49.285893 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.289162 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.289194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.289203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.289217 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.289225 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: E0217 09:05:49.298903 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.301871 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.301913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.301922 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.301938 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.301948 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: E0217 09:05:49.312977 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.315952 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.315977 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.315985 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.315997 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.316007 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.331789 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 20:18:37.511375653 +0000 UTC Feb 17 09:05:49 crc kubenswrapper[4848]: E0217 09:05:49.332725 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",
\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: E0217 09:05:49.332990 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.334866 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.334903 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.334915 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.334932 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.334943 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.382670 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.382701 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.382711 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:49 crc kubenswrapper[4848]: E0217 09:05:49.382845 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:05:49 crc kubenswrapper[4848]: E0217 09:05:49.382963 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:05:49 crc kubenswrapper[4848]: E0217 09:05:49.383077 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.436963 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.437194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.437203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.437215 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.437224 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.539037 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.539113 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.539139 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.539169 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.539187 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.545902 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16" exitCode=0 Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.545940 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.546002 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"c9750d1add3ad0fb790a8bb1dc39fcd028bf1a3fe185f34e473c0dc886886286"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.547823 4848 generic.go:334] "Generic (PLEG): container finished" podID="3244ef77-7b63-45b9-9b12-2b12cb6654df" containerID="52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2" exitCode=0 Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.548148 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" event={"ID":"3244ef77-7b63-45b9-9b12-2b12cb6654df","Type":"ContainerDied","Data":"52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.560021 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.576891 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.588904 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.612189 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.626796 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.638828 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.648570 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.648611 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.648619 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.648660 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.648670 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.653242 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.669309 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.689041 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.709132 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.727227 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.739709 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.749247 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f0
5faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.751106 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.751140 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.751151 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.751169 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.751182 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.761407 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.779718 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,
\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.796371 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 
2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.807611 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.853598 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.853636 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.853646 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.853663 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.853674 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.875991 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.893850 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.909607 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.920610 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.931931 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.944885 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.955394 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.955419 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.955427 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 
09:05:49.955439 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.955447 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:49Z","lastTransitionTime":"2026-02-17T09:05:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.955501 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.965471 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:49 crc kubenswrapper[4848]: I0217 09:05:49.978949 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:49Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.058016 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:50 crc 
kubenswrapper[4848]: I0217 09:05:50.058062 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.058078 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.058100 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.058115 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:50Z","lastTransitionTime":"2026-02-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.159984 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.160018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.160027 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.160040 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.160049 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:50Z","lastTransitionTime":"2026-02-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.262596 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.262641 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.262652 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.262669 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.262681 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:50Z","lastTransitionTime":"2026-02-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.333112 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 09:14:26.356723659 +0000 UTC Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.365537 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.365576 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.365586 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.365605 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.365618 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:50Z","lastTransitionTime":"2026-02-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.468722 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.468802 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.468819 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.468843 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.468862 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:50Z","lastTransitionTime":"2026-02-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.554353 4848 generic.go:334] "Generic (PLEG): container finished" podID="3244ef77-7b63-45b9-9b12-2b12cb6654df" containerID="c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231" exitCode=0 Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.554461 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" event={"ID":"3244ef77-7b63-45b9-9b12-2b12cb6654df","Type":"ContainerDied","Data":"c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.560001 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.560061 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.560086 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.560110 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.560134 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.560157 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.571374 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.571424 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.571441 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.571466 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.571483 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:50Z","lastTransitionTime":"2026-02-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.575550 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.601057 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.613867 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.620742 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.624984 4848 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.631502 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.648579 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.666972 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.674502 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.674589 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.674615 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.674650 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.674675 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:50Z","lastTransitionTime":"2026-02-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.691276 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:
05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.715335 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.731372 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.751571 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-i
dentity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.765816 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.777615 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.777657 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.777668 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.777686 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.777707 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:50Z","lastTransitionTime":"2026-02-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.794266 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.813987 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.831882 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.850177 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/dock
er/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.877821 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.880046 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.880083 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.880094 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.880115 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.880128 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:50Z","lastTransitionTime":"2026-02-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.893559 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.909004 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.924526 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.937243 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.944311 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.944436 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.944481 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:50 crc kubenswrapper[4848]: E0217 09:05:50.944537 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:05:58.94450572 +0000 UTC m=+36.487761376 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:05:50 crc kubenswrapper[4848]: E0217 09:05:50.944554 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:50 crc kubenswrapper[4848]: E0217 09:05:50.944601 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:58.944589722 +0000 UTC m=+36.487845368 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:50 crc kubenswrapper[4848]: E0217 09:05:50.944637 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:50 crc kubenswrapper[4848]: E0217 09:05:50.944734 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:58.944709116 +0000 UTC m=+36.487964802 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.950490 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.960530 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.972033 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-al
erter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.982370 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.982434 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.982446 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.982463 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.982475 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:50Z","lastTransitionTime":"2026-02-17T09:05:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.983668 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:50 crc kubenswrapper[4848]: I0217 09:05:50.995379 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:50Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.007740 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.020399 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.032985 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.045293 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.045347 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.045468 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.045484 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.045496 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.045534 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.045582 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.045605 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.045549 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:59.045534866 +0000 UTC m=+36.588790502 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.045708 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 09:05:59.04567312 +0000 UTC m=+36.588928806 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.085868 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.085923 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.085941 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.085968 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.085987 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:51Z","lastTransitionTime":"2026-02-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.188552 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.188608 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.188626 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.188651 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.188670 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:51Z","lastTransitionTime":"2026-02-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.291749 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.291846 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.291869 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.291897 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.291915 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:51Z","lastTransitionTime":"2026-02-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.334086 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 15:00:10.878595751 +0000 UTC Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.382856 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.382881 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.383172 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.383089 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.383337 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.383442 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.394198 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.394239 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.394251 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.394269 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.394283 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:51Z","lastTransitionTime":"2026-02-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.497445 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.497506 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.497526 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.497545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.497557 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:51Z","lastTransitionTime":"2026-02-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.566882 4848 generic.go:334] "Generic (PLEG): container finished" podID="3244ef77-7b63-45b9-9b12-2b12cb6654df" containerID="0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f" exitCode=0 Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.566964 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" event={"ID":"3244ef77-7b63-45b9-9b12-2b12cb6654df","Type":"ContainerDied","Data":"0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f"} Feb 17 09:05:51 crc kubenswrapper[4848]: E0217 09:05:51.579089 4848 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-crc\" already exists" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.600909 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.600970 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.600981 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.601002 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.601016 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:51Z","lastTransitionTime":"2026-02-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.603378 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.626729 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547
933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.646519 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.666288 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.681263 4848 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.701613 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.704207 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.704234 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.704245 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.704260 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.704271 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:51Z","lastTransitionTime":"2026-02-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.716090 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.726044 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.738466 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.753393 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.771066 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.792442 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.805471 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.807515 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.807565 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.807581 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.807603 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.807620 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:51Z","lastTransitionTime":"2026-02-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.820682 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:51Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.909677 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.909711 4848 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.909723 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.909738 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:51 crc kubenswrapper[4848]: I0217 09:05:51.909750 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:51Z","lastTransitionTime":"2026-02-17T09:05:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.012406 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.012461 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.012484 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.012512 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.012534 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:52Z","lastTransitionTime":"2026-02-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.116114 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.116156 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.116167 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.116185 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.116201 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:52Z","lastTransitionTime":"2026-02-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.219587 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.219631 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.219641 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.219656 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.219667 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:52Z","lastTransitionTime":"2026-02-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.322122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.322179 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.322196 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.322220 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.322241 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:52Z","lastTransitionTime":"2026-02-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.334961 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 13:28:40.011517664 +0000 UTC Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.425380 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.425427 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.425440 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.425459 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.425472 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:52Z","lastTransitionTime":"2026-02-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.528888 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.528959 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.528977 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.528998 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.529011 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:52Z","lastTransitionTime":"2026-02-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.573392 4848 generic.go:334] "Generic (PLEG): container finished" podID="3244ef77-7b63-45b9-9b12-2b12cb6654df" containerID="307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722" exitCode=0 Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.573484 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" event={"ID":"3244ef77-7b63-45b9-9b12-2b12cb6654df","Type":"ContainerDied","Data":"307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.579860 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.596611 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.620391 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.631901 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.631938 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.631950 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.631967 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.631979 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:52Z","lastTransitionTime":"2026-02-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.639129 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.665011 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.690173 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547
933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.705561 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt
\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.724045 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.734578 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.734622 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.734634 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.734650 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.734661 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:52Z","lastTransitionTime":"2026-02-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.738550 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.755350 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.769404 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.782409 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.796837 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.809216 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.820062 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f0
5faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:52Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.838238 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.838310 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.838325 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.838352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.838369 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:52Z","lastTransitionTime":"2026-02-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.941388 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.941447 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.941463 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.941489 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:52 crc kubenswrapper[4848]: I0217 09:05:52.941507 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:52Z","lastTransitionTime":"2026-02-17T09:05:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.050313 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.050367 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.050381 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.050400 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.050417 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:53Z","lastTransitionTime":"2026-02-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.055932 4848 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.154854 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.154894 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.154905 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.154921 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.154933 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:53Z","lastTransitionTime":"2026-02-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.264999 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.265057 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.265069 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.265093 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.265107 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:53Z","lastTransitionTime":"2026-02-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.335822 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 23:34:36.076648229 +0000 UTC Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.370423 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.370482 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.370501 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.370525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.370542 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:53Z","lastTransitionTime":"2026-02-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.383194 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:53 crc kubenswrapper[4848]: E0217 09:05:53.383345 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.383967 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:53 crc kubenswrapper[4848]: E0217 09:05:53.384060 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.384134 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:53 crc kubenswrapper[4848]: E0217 09:05:53.384211 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.406007 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.428275 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.447357 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.462937 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.472237 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.472300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.472313 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.472333 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.472347 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:53Z","lastTransitionTime":"2026-02-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.477213 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:
05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.490271 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-pr
oxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.503914 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.516833 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.535382 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.567252 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.574623 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.574709 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.574735 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.574791 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.574814 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:53Z","lastTransitionTime":"2026-02-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.585854 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.587736 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" event={"ID":"3244ef77-7b63-45b9-9b12-2b12cb6654df","Type":"ContainerStarted","Data":"22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.604191 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.621687 4848 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.633449 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.645243 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.658574 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.676532 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\
\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.677068 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.677200 
4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.677281 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.677371 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.677457 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:53Z","lastTransitionTime":"2026-02-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.687035 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.710059 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.726476 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547
933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.739303 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.753102 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.767295 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.779923 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.779988 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.780006 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.780032 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.780048 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:53Z","lastTransitionTime":"2026-02-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.783162 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.799880 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.817510 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.832663 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.847656 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:53Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.882829 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:53 crc 
kubenswrapper[4848]: I0217 09:05:53.882875 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.882887 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.882927 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.882940 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:53Z","lastTransitionTime":"2026-02-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.985369 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.985456 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.985496 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.985529 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:53 crc kubenswrapper[4848]: I0217 09:05:53.985551 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:53Z","lastTransitionTime":"2026-02-17T09:05:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.088947 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.089003 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.089016 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.089033 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.089046 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:54Z","lastTransitionTime":"2026-02-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.191377 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.191445 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.191462 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.191493 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.191510 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:54Z","lastTransitionTime":"2026-02-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.294093 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.294170 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.294194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.294228 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.294251 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:54Z","lastTransitionTime":"2026-02-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.337569 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 12:06:00.890522389 +0000 UTC Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.397693 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.397804 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.397829 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.397858 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.397881 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:54Z","lastTransitionTime":"2026-02-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.501107 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.501161 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.501177 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.501202 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.501219 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:54Z","lastTransitionTime":"2026-02-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.598184 4848 generic.go:334] "Generic (PLEG): container finished" podID="3244ef77-7b63-45b9-9b12-2b12cb6654df" containerID="22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4" exitCode=0 Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.598262 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" event={"ID":"3244ef77-7b63-45b9-9b12-2b12cb6654df","Type":"ContainerDied","Data":"22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4"} Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.606474 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.606520 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.606537 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.606559 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.606577 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:54Z","lastTransitionTime":"2026-02-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.621192 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.640755 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.662013 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.691961 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.710249 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.710297 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.710311 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.710330 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.710344 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:54Z","lastTransitionTime":"2026-02-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.718485 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.739109 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.752451 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.767957 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.786535 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.800866 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.813100 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.813136 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:54 crc 
kubenswrapper[4848]: I0217 09:05:54.813147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.813164 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.813176 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:54Z","lastTransitionTime":"2026-02-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.815491 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.832274 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.847285 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.859873 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f0
5faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:54Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.915896 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.915941 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.915955 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.915973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:54 crc kubenswrapper[4848]: I0217 09:05:54.915988 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:54Z","lastTransitionTime":"2026-02-17T09:05:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.019121 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.019181 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.019204 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.019233 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.019256 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:55Z","lastTransitionTime":"2026-02-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.122118 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.122194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.122220 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.122251 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.122274 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:55Z","lastTransitionTime":"2026-02-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.225443 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.225499 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.225514 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.225535 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.225554 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:55Z","lastTransitionTime":"2026-02-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.329112 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.329179 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.329199 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.329223 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.329241 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:55Z","lastTransitionTime":"2026-02-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.338030 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 09:45:19.71897716 +0000 UTC Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.382740 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.382812 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:55 crc kubenswrapper[4848]: E0217 09:05:55.382989 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:05:55 crc kubenswrapper[4848]: E0217 09:05:55.383245 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.383414 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:55 crc kubenswrapper[4848]: E0217 09:05:55.383605 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.436914 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.436958 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.436968 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.436984 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.436994 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:55Z","lastTransitionTime":"2026-02-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.539373 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.539422 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.539433 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.539451 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.539463 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:55Z","lastTransitionTime":"2026-02-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.607645 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.608079 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.613603 4848 generic.go:334] "Generic (PLEG): container finished" podID="3244ef77-7b63-45b9-9b12-2b12cb6654df" containerID="831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e" exitCode=0 Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.613654 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" event={"ID":"3244ef77-7b63-45b9-9b12-2b12cb6654df","Type":"ContainerDied","Data":"831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.628847 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.656823 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.657189 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.657239 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.657256 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.657282 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.657299 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:55Z","lastTransitionTime":"2026-02-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.661255 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.669219 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when 
the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.681240 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.693954 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.707019 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.721018 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.736850 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.750046 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.759419 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.759442 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.759453 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.759468 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.759477 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:55Z","lastTransitionTime":"2026-02-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.760679 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f
\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.777647 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.789889 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.801876 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.815567 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.831751 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.846705 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.861662 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.861712 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.861722 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.861737 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.861750 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:55Z","lastTransitionTime":"2026-02-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.864347 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b77
75917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.883978 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.898194 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.913146 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.924808 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.939086 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.953190 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f0
5faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.964554 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.964599 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.964612 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.964628 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.964639 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:55Z","lastTransitionTime":"2026-02-17T09:05:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.965674 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-ope
rator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.977753 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:55 crc kubenswrapper[4848]: I0217 09:05:55.992357 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:55Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.025958 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.049802 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.067320 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.067356 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.067367 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.067384 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.067396 4848 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:56Z","lastTransitionTime":"2026-02-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.169714 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.169807 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.169832 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.169862 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.169882 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:56Z","lastTransitionTime":"2026-02-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.273222 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.273335 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.273358 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.273406 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.273429 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:56Z","lastTransitionTime":"2026-02-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.338748 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 12:59:32.251502397 +0000 UTC Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.375934 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.375983 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.375999 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.376022 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.376040 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:56Z","lastTransitionTime":"2026-02-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.479194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.479237 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.479248 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.479263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.479275 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:56Z","lastTransitionTime":"2026-02-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.582097 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.582165 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.582185 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.582211 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.582232 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:56Z","lastTransitionTime":"2026-02-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.624169 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" event={"ID":"3244ef77-7b63-45b9-9b12-2b12cb6654df","Type":"ContainerStarted","Data":"d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b"} Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.624283 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.624872 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.647877 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.664824 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.671721 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.685222 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.685295 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.685319 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.685352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.685376 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:56Z","lastTransitionTime":"2026-02-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.696978 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211
439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.715703 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.752382 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.782755 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6
c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a13285408
91a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T
09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.788876 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.788923 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.788935 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.788953 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.788965 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:56Z","lastTransitionTime":"2026-02-17T09:05:56Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.806788 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256
:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.823905 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.841270 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.862612 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.879032 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.891068 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.892016 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.892066 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.892090 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.892115 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.892128 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:56Z","lastTransitionTime":"2026-02-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.911192 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.925494 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.939266 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 
09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.952066 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.963995 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.977489 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.993045 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:56Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.994950 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:56 crc 
kubenswrapper[4848]: I0217 09:05:56.995015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.995035 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.995060 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:56 crc kubenswrapper[4848]: I0217 09:05:56.995078 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:56Z","lastTransitionTime":"2026-02-17T09:05:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.009546 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:57Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.053095 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:57Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.077987 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:57Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.092248 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:57Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.097221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.097263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.097276 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.097294 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.097311 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:57Z","lastTransitionTime":"2026-02-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.114775 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:57Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.132573 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6
c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a13285408
91a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T
09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:57Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.152575 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:57Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.167705 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:57Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.181570 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:57Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.199423 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.199465 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.199477 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.199492 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.199501 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:57Z","lastTransitionTime":"2026-02-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.302219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.302280 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.302295 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.302319 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.302334 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:57Z","lastTransitionTime":"2026-02-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.339122 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 21:43:31.737958513 +0000 UTC Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.382550 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:57 crc kubenswrapper[4848]: E0217 09:05:57.382811 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.383024 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.383097 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:57 crc kubenswrapper[4848]: E0217 09:05:57.383171 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:05:57 crc kubenswrapper[4848]: E0217 09:05:57.383261 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.405161 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.405192 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.405203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.405219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.405230 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:57Z","lastTransitionTime":"2026-02-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.507835 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.507891 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.507908 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.507932 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.507950 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:57Z","lastTransitionTime":"2026-02-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.610667 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.610718 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.610731 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.610746 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.610782 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:57Z","lastTransitionTime":"2026-02-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.628892 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.713424 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.713452 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.713460 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.713476 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.713485 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:57Z","lastTransitionTime":"2026-02-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.815911 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.815978 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.815996 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.816023 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.816040 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:57Z","lastTransitionTime":"2026-02-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.918723 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.918845 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.918867 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.918891 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:57 crc kubenswrapper[4848]: I0217 09:05:57.918910 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:57Z","lastTransitionTime":"2026-02-17T09:05:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.021986 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.022029 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.022046 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.022064 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.022075 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:58Z","lastTransitionTime":"2026-02-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.124879 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.124926 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.124940 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.124961 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.124976 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:58Z","lastTransitionTime":"2026-02-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.228157 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.228412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.228492 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.228584 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.228662 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:58Z","lastTransitionTime":"2026-02-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.332274 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.332357 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.332374 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.332399 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.332416 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:58Z","lastTransitionTime":"2026-02-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.339849 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 10:42:54.79576117 +0000 UTC Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.357453 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.380396 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/stati
c-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.401050 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.419485 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.435132 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.435192 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.435210 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.435239 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.435260 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:58Z","lastTransitionTime":"2026-02-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.437874 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.458661 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.479876 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f0
5faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.502486 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.520437 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.534655 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.538276 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.538389 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.538416 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.538451 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.538475 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:58Z","lastTransitionTime":"2026-02-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.559479 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.583849 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6
c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a13285408
91a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{
\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T
09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.605310 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.625138 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.635223 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/0.log" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.639804 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716" exitCode=1 Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.639863 
4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.640534 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.640579 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.640602 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.640629 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.640652 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:58Z","lastTransitionTime":"2026-02-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.641209 4848 scope.go:117] "RemoveContainer" containerID="04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.646103 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.666791 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.687168 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.703514 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.720937 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.743731 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.743976 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.744101 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.744223 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.744336 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:58Z","lastTransitionTime":"2026-02-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.744608 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:57Z\\\",\\\"message\\\":\\\".AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 09:05:57.841589 6151 reflector.go:311] Stopping reflector *v1.Service (0s) from 
k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.841845 6151 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:05:57.841850 6151 factory.go:656] Stopping watch factory\\\\nI0217 09:05:57.841921 6151 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:57.841647 6151 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.841985 6151 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 09:05:57.841927 6151 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.842331 6151 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.842420 6151 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 09:05:57.842447 6151 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.768497 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30709
8be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.785486 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.824424 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.842922 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.846971 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.847005 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 
09:05:58.847015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.847033 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.847047 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:58Z","lastTransitionTime":"2026-02-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.859826 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-ce
rt-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.880218 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.931030 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.941427 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.949678 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.949729 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.949739 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.949754 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.949775 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:58Z","lastTransitionTime":"2026-02-17T09:05:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:58 crc kubenswrapper[4848]: I0217 09:05:58.952557 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:
05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:58Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.047009 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.047124 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.047150 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.047169 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.047193 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.047265 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.047309 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:06:15.047296408 +0000 UTC m=+52.590552054 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048307 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048346 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:06:15.048335797 +0000 UTC m=+52.591591433 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048410 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048423 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048435 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048463 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:06:15.048417689 +0000 UTC m=+52.591673395 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048526 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 09:06:15.048505552 +0000 UTC m=+52.591761348 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048569 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048595 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048610 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.048664 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 09:06:15.048647765 +0000 UTC m=+52.591903421 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.052124 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.052183 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.052208 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.052239 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.052263 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.158048 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.158098 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.158108 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.158122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.158131 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.260185 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.260229 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.260238 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.260262 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.260271 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.340890 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 00:25:16.827738749 +0000 UTC Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.363203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.363263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.363280 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.363303 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.363320 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.382927 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.382949 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.383179 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.383095 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.383317 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.383478 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.466144 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.466197 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.466206 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.466220 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.466228 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.564567 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.564621 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.564635 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.564660 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.564675 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.585117 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.589237 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.589412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.589526 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.589665 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.589810 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.608187 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.612690 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.612894 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.613033 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.613186 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.613322 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.628399 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.632375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.632434 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.632453 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.632478 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.632497 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.645596 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/0.log" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.649850 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.649976 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.651341 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.656265 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.656298 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.656314 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.656336 4848 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.656353 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.666021 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.671260 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: E0217 09:05:59.671541 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.673219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.673265 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.673282 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.673302 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.673318 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.679320 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.693665 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.706306 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\
"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.719428 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.730984 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.745583 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.758804 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.772672 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f0
5faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.775175 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.775253 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.775273 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 
17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.775300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.775318 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.785290 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-ope
rator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.799852 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.811061 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.834506 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:57Z\\\",\\\"message\\\":\\\".AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 09:05:57.841589 6151 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.841845 6151 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 
09:05:57.841850 6151 factory.go:656] Stopping watch factory\\\\nI0217 09:05:57.841921 6151 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:57.841647 6151 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.841985 6151 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 09:05:57.841927 6151 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.842331 6151 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.842420 6151 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 09:05:57.842447 6151 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readO
nly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\
\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.848899 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e5431
9f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:05:59Z is after 2025-08-24T17:21:41Z" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.878404 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.878593 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.878658 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.878736 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.878834 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.980737 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.980794 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.980806 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.980839 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:05:59 crc kubenswrapper[4848]: I0217 09:05:59.980850 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:05:59Z","lastTransitionTime":"2026-02-17T09:05:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.083556 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.083597 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.083608 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.083625 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.083637 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:00Z","lastTransitionTime":"2026-02-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.186038 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.186394 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.186525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.186645 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.186794 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:00Z","lastTransitionTime":"2026-02-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.290346 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.290396 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.290414 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.290434 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.290449 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:00Z","lastTransitionTime":"2026-02-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.341476 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 18:55:28.345745672 +0000 UTC Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.393980 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.394361 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.394566 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.394708 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.394880 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:00Z","lastTransitionTime":"2026-02-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.498626 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.498697 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.498715 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.498742 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.498789 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:00Z","lastTransitionTime":"2026-02-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.602171 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.602544 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.602711 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.602929 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.603107 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:00Z","lastTransitionTime":"2026-02-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.660181 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/1.log" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.661369 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/0.log" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.665455 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466" exitCode=1 Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.665534 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.665614 4848 scope.go:117] "RemoveContainer" containerID="04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.666857 4848 scope.go:117] "RemoveContainer" containerID="dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466" Feb 17 09:06:00 crc kubenswrapper[4848]: E0217 09:06:00.667214 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.687965 4848 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224
b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.706326 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.706358 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.706366 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.706382 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.706393 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:00Z","lastTransitionTime":"2026-02-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.706975 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.723423 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.741386 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.759365 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.775878 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f0
5faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.788686 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv"] Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.789856 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.792170 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.793038 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.800027 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\"
:[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.809100 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.809147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.809166 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.809197 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.809223 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:00Z","lastTransitionTime":"2026-02-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.819661 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.834132 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.863529 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:57Z\\\",\\\"message\\\":\\\".AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 09:05:57.841589 6151 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.841845 6151 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 
09:05:57.841850 6151 factory.go:656] Stopping watch factory\\\\nI0217 09:05:57.841921 6151 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:57.841647 6151 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.841985 6151 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 09:05:57.841927 6151 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.842331 6151 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.842420 6151 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 09:05:57.842447 6151 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"6298 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 09:05:59.605652 6298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:59.605698 6298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 09:05:59.605746 6298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 09:05:59.605455 6298 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605884 6298 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 09:05:59.605117 6298 
handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:05:59.605123 6298 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.604852 6298 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605062 6298 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.606082 6298 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0217 09:05:59.606302 6298 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/netwo
rks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.886369 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba19643
77ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:
49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"
cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.904341 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.912216 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.912256 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.912270 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 
09:06:00.912293 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.912305 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:00Z","lastTransitionTime":"2026-02-17T09:06:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.920123 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.934351 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.949007 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.965397 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.966965 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7de1a57-0b76-454f-bc44-ad6632f90e5a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.967034 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs96q\" (UniqueName: \"kubernetes.io/projected/d7de1a57-0b76-454f-bc44-ad6632f90e5a-kube-api-access-bs96q\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.967076 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7de1a57-0b76-454f-bc44-ad6632f90e5a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.967175 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7de1a57-0b76-454f-bc44-ad6632f90e5a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.976498 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28
f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:00 crc kubenswrapper[4848]: I0217 09:06:00.998981 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:00Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.014036 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.015240 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.015295 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.015313 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.015338 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.015355 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:01Z","lastTransitionTime":"2026-02-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.029877 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:
05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.050897 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4
b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{
\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.068197 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7de1a57-0b76-454f-bc44-ad6632f90e5a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.068237 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-bs96q\" (UniqueName: \"kubernetes.io/projected/d7de1a57-0b76-454f-bc44-ad6632f90e5a-kube-api-access-bs96q\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.068266 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7de1a57-0b76-454f-bc44-ad6632f90e5a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.068293 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7de1a57-0b76-454f-bc44-ad6632f90e5a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.069189 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7de1a57-0b76-454f-bc44-ad6632f90e5a-env-overrides\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.069282 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7de1a57-0b76-454f-bc44-ad6632f90e5a-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.072696 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.077734 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7de1a57-0b76-454f-bc44-ad6632f90e5a-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.089000 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.099581 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs96q\" (UniqueName: \"kubernetes.io/projected/d7de1a57-0b76-454f-bc44-ad6632f90e5a-kube-api-access-bs96q\") pod \"ovnkube-control-plane-749d76644c-65rnv\" (UID: \"d7de1a57-0b76-454f-bc44-ad6632f90e5a\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.106611 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.113387 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.135830 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:57Z\\\",\\\"message\\\":\\\".AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 09:05:57.841589 6151 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.841845 6151 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 
09:05:57.841850 6151 factory.go:656] Stopping watch factory\\\\nI0217 09:05:57.841921 6151 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:57.841647 6151 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.841985 6151 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 09:05:57.841927 6151 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.842331 6151 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.842420 6151 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 09:05:57.842447 6151 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"6298 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 09:05:59.605652 6298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:59.605698 6298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 09:05:59.605746 6298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 09:05:59.605455 6298 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605884 6298 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 09:05:59.605117 6298 
handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:05:59.605123 6298 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.604852 6298 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605062 6298 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.606082 6298 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0217 09:05:59.606302 6298 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/netwo
rks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.160521 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba19643
77ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:
49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"
cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.161219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.161263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.161279 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.161300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.161317 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:01Z","lastTransitionTime":"2026-02-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.186053 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.204665 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.217644 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.263792 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.263846 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.263867 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.263892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.263908 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:01Z","lastTransitionTime":"2026-02-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.342185 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-28 21:04:16.671903304 +0000 UTC Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.367253 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.367309 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.367324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.367422 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.367441 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:01Z","lastTransitionTime":"2026-02-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.383315 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:01 crc kubenswrapper[4848]: E0217 09:06:01.383430 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.383660 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.383809 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:01 crc kubenswrapper[4848]: E0217 09:06:01.383980 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:01 crc kubenswrapper[4848]: E0217 09:06:01.384135 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.475325 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.475387 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.475406 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.475432 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.475451 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:01Z","lastTransitionTime":"2026-02-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.579441 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.579807 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.579961 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.580111 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.580241 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:01Z","lastTransitionTime":"2026-02-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.678052 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" event={"ID":"d7de1a57-0b76-454f-bc44-ad6632f90e5a","Type":"ContainerStarted","Data":"9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.678738 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" event={"ID":"d7de1a57-0b76-454f-bc44-ad6632f90e5a","Type":"ContainerStarted","Data":"a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.678924 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" event={"ID":"d7de1a57-0b76-454f-bc44-ad6632f90e5a","Type":"ContainerStarted","Data":"534fd5f344f6cdcd40f5e57388cef865ced16a06e843024683786a989c27f63f"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.680338 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/1.log" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.682641 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.682677 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.682693 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.682716 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:01 crc 
kubenswrapper[4848]: I0217 09:06:01.682733 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:01Z","lastTransitionTime":"2026-02-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.684331 4848 scope.go:117] "RemoveContainer" containerID="dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466" Feb 17 09:06:01 crc kubenswrapper[4848]: E0217 09:06:01.684509 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.703340 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.721176 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.738829 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.755466 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.771849 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.785464 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.785505 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.785516 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 
09:06:01.785530 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.785545 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:01Z","lastTransitionTime":"2026-02-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.792348 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.811026 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.830380 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.846986 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:
42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b2260
2407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.862078 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.879214 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.892436 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.892466 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.892474 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.892492 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.892501 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:01Z","lastTransitionTime":"2026-02-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.893320 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.919298 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04d165b858721fe2f99d132e149cf569d192525bfe1a9e8e57b9c8a9a2e9b716\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:57Z\\\",\\\"message\\\":\\\".AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0217 09:05:57.841589 6151 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.841845 6151 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 
09:05:57.841850 6151 factory.go:656] Stopping watch factory\\\\nI0217 09:05:57.841921 6151 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:57.841647 6151 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.841985 6151 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 09:05:57.841927 6151 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.842331 6151 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:57.842420 6151 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0217 09:05:57.842447 6151 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"6298 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 09:05:59.605652 6298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:59.605698 6298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 09:05:59.605746 6298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 09:05:59.605455 6298 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605884 6298 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 09:05:59.605117 6298 
handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:05:59.605123 6298 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.604852 6298 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605062 6298 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.606082 6298 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0217 09:05:59.606302 6298 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/netwo
rks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.937599 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba19643
77ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:
49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"
cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.947897 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-78r6x"] Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.948321 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:01 crc kubenswrapper[4848]: E0217 09:06:01.948373 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.952568 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.981462 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"6298 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 09:05:59.605652 6298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:59.605698 6298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 09:05:59.605746 6298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 09:05:59.605455 6298 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605884 6298 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 09:05:59.605117 6298 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:05:59.605123 6298 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.604852 6298 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605062 6298 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.606082 6298 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0217 09:05:59.606302 6298 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c
0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.994833 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.994918 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.994941 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.994972 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:01 crc kubenswrapper[4848]: I0217 09:06:01.994994 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:01Z","lastTransitionTime":"2026-02-17T09:06:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.002471 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:01Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.021298 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.042694 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.057406 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.078338 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq5wq\" (UniqueName: \"kubernetes.io/projected/98bfddd8-4a1a-4b90-973a-adb75b02fdba-kube-api-access-fq5wq\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.078456 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.078987 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc 
kubenswrapper[4848]: I0217 09:06:02.098601 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.098639 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.098648 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.098662 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.098672 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:02Z","lastTransitionTime":"2026-02-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.101263 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b77
75917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.136170 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.146885 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.157895 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.169868 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.179546 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq5wq\" (UniqueName: \"kubernetes.io/projected/98bfddd8-4a1a-4b90-973a-adb75b02fdba-kube-api-access-fq5wq\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.179584 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:02 crc kubenswrapper[4848]: E0217 09:06:02.179701 4848 secret.go:188] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:02 crc kubenswrapper[4848]: E0217 09:06:02.179743 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs podName:98bfddd8-4a1a-4b90-973a-adb75b02fdba nodeName:}" failed. No retries permitted until 2026-02-17 09:06:02.679731354 +0000 UTC m=+40.222987000 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs") pod "network-metrics-daemon-78r6x" (UID: "98bfddd8-4a1a-4b90-973a-adb75b02fdba") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.180207 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eb
a99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.192484 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 
09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.200728 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.200754 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.200775 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.200788 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.200797 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:02Z","lastTransitionTime":"2026-02-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.205085 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.209168 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq5wq\" (UniqueName: \"kubernetes.io/projected/98bfddd8-4a1a-4b90-973a-adb75b02fdba-kube-api-access-fq5wq\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.215169 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.224923 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:02Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.304317 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 
09:06:02.304380 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.304402 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.304432 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.304452 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:02Z","lastTransitionTime":"2026-02-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.342730 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 23:36:05.440388749 +0000 UTC Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.407852 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.407886 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.407893 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.407906 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.407914 4848 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:02Z","lastTransitionTime":"2026-02-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.510944 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.511383 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.511553 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.511704 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.511912 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:02Z","lastTransitionTime":"2026-02-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.615611 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.615671 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.615688 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.615711 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.615729 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:02Z","lastTransitionTime":"2026-02-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.684661 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:02 crc kubenswrapper[4848]: E0217 09:06:02.684879 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:02 crc kubenswrapper[4848]: E0217 09:06:02.685633 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs podName:98bfddd8-4a1a-4b90-973a-adb75b02fdba nodeName:}" failed. No retries permitted until 2026-02-17 09:06:03.685598511 +0000 UTC m=+41.228854167 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs") pod "network-metrics-daemon-78r6x" (UID: "98bfddd8-4a1a-4b90-973a-adb75b02fdba") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.717976 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.718055 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.718074 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.718102 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.718119 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:02Z","lastTransitionTime":"2026-02-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.820359 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.820431 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.820456 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.820486 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.820508 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:02Z","lastTransitionTime":"2026-02-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.923456 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.923517 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.923535 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.923560 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:02 crc kubenswrapper[4848]: I0217 09:06:02.923579 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:02Z","lastTransitionTime":"2026-02-17T09:06:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.027862 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.027930 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.027968 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.027997 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.028020 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:03Z","lastTransitionTime":"2026-02-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.130901 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.130954 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.130971 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.130992 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.131010 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:03Z","lastTransitionTime":"2026-02-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.233381 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.233436 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.233453 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.233476 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.233494 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:03Z","lastTransitionTime":"2026-02-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.336109 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.336146 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.336157 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.336172 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.336183 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:03Z","lastTransitionTime":"2026-02-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.343735 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 15:32:02.925995365 +0000 UTC Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.383257 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.383364 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:03 crc kubenswrapper[4848]: E0217 09:06:03.383415 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.383457 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:03 crc kubenswrapper[4848]: E0217 09:06:03.383682 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.383706 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:03 crc kubenswrapper[4848]: E0217 09:06:03.384816 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:03 crc kubenswrapper[4848]: E0217 09:06:03.385022 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.402889 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.424867 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.438574 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.438659 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.438682 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.438712 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.438735 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:03Z","lastTransitionTime":"2026-02-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.446917 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.472899 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.489538 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.518581 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"6298 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 09:05:59.605652 6298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:59.605698 6298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 09:05:59.605746 6298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 09:05:59.605455 6298 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605884 6298 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 09:05:59.605117 6298 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:05:59.605123 6298 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.604852 6298 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605062 6298 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.606082 6298 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0217 09:05:59.606302 6298 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c
0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.542020 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.542059 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.542070 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.542086 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.542097 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:03Z","lastTransitionTime":"2026-02-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.543740 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.562324 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.578012 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.596833 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.615900 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc 
kubenswrapper[4848]: I0217 09:06:03.635851 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc 
kubenswrapper[4848]: I0217 09:06:03.644665 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.644712 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.644729 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.644752 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.644794 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:03Z","lastTransitionTime":"2026-02-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.657056 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.675567 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.693066 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.697562 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:03 crc kubenswrapper[4848]: E0217 09:06:03.697893 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:03 crc kubenswrapper[4848]: E0217 09:06:03.697989 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs podName:98bfddd8-4a1a-4b90-973a-adb75b02fdba nodeName:}" failed. No retries permitted until 2026-02-17 09:06:05.697963804 +0000 UTC m=+43.241219490 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs") pod "network-metrics-daemon-78r6x" (UID: "98bfddd8-4a1a-4b90-973a-adb75b02fdba") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.717259 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:03Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.747887 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.748108 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.748246 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.748384 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.748501 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:03Z","lastTransitionTime":"2026-02-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.851877 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.851942 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.851966 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.851996 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.852018 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:03Z","lastTransitionTime":"2026-02-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.955025 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.955084 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.955102 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.955138 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:03 crc kubenswrapper[4848]: I0217 09:06:03.955157 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:03Z","lastTransitionTime":"2026-02-17T09:06:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.058865 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.058942 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.058967 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.058999 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.059022 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:04Z","lastTransitionTime":"2026-02-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.162673 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.162741 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.162791 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.162816 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.162833 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:04Z","lastTransitionTime":"2026-02-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.265728 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.265783 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.265793 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.265807 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.265816 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:04Z","lastTransitionTime":"2026-02-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.344337 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 18:18:56.470403922 +0000 UTC Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.368885 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.369125 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.369251 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.369404 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.369595 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:04Z","lastTransitionTime":"2026-02-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.473071 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.473130 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.473147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.473168 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.473185 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:04Z","lastTransitionTime":"2026-02-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.513591 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.514968 4848 scope.go:117] "RemoveContainer" containerID="dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466" Feb 17 09:06:04 crc kubenswrapper[4848]: E0217 09:06:04.515307 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.577019 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.577083 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.577100 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.577127 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.577145 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:04Z","lastTransitionTime":"2026-02-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.680151 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.680205 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.680221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.680250 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.680271 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:04Z","lastTransitionTime":"2026-02-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.785378 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.785460 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.785483 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.785513 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.785536 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:04Z","lastTransitionTime":"2026-02-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.888661 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.888707 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.888720 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.888739 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.888752 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:04Z","lastTransitionTime":"2026-02-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.991471 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.991524 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.991543 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.991566 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:04 crc kubenswrapper[4848]: I0217 09:06:04.991582 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:04Z","lastTransitionTime":"2026-02-17T09:06:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.094100 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.094168 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.094191 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.094223 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.094246 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:05Z","lastTransitionTime":"2026-02-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.197639 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.197705 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.197724 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.197748 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.197796 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:05Z","lastTransitionTime":"2026-02-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.300142 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.300197 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.300212 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.300232 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.300246 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:05Z","lastTransitionTime":"2026-02-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.345315 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 09:45:05.995365087 +0000 UTC Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.384006 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.384057 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.384120 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:05 crc kubenswrapper[4848]: E0217 09:06:05.384207 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.384277 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:05 crc kubenswrapper[4848]: E0217 09:06:05.384469 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:05 crc kubenswrapper[4848]: E0217 09:06:05.384575 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:05 crc kubenswrapper[4848]: E0217 09:06:05.384653 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.403032 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.403133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.403152 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.403203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.403246 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:05Z","lastTransitionTime":"2026-02-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.505350 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.505511 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.505539 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.505616 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.505646 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:05Z","lastTransitionTime":"2026-02-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.609103 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.609170 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.609183 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.609200 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.609231 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:05Z","lastTransitionTime":"2026-02-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.713201 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.713247 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.713263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.713281 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.713295 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:05Z","lastTransitionTime":"2026-02-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.722414 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:05 crc kubenswrapper[4848]: E0217 09:06:05.722630 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:05 crc kubenswrapper[4848]: E0217 09:06:05.722717 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs podName:98bfddd8-4a1a-4b90-973a-adb75b02fdba nodeName:}" failed. No retries permitted until 2026-02-17 09:06:09.722693188 +0000 UTC m=+47.265948934 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs") pod "network-metrics-daemon-78r6x" (UID: "98bfddd8-4a1a-4b90-973a-adb75b02fdba") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.820943 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.821033 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.821081 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.821106 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.821124 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:05Z","lastTransitionTime":"2026-02-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.923828 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.923885 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.923901 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.923927 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:05 crc kubenswrapper[4848]: I0217 09:06:05.923948 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:05Z","lastTransitionTime":"2026-02-17T09:06:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.027281 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.027375 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.027396 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.027419 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.027437 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:06Z","lastTransitionTime":"2026-02-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.130338 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.130382 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.130394 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.130412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.130424 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:06Z","lastTransitionTime":"2026-02-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.233263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.233331 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.233349 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.233376 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.233394 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:06Z","lastTransitionTime":"2026-02-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.337142 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.337216 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.337245 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.337279 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.337302 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:06Z","lastTransitionTime":"2026-02-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.345789 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 04:29:08.549787795 +0000 UTC Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.439191 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.439265 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.439287 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.439314 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.439333 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:06Z","lastTransitionTime":"2026-02-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.542125 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.542204 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.542229 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.542259 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.542286 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:06Z","lastTransitionTime":"2026-02-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.645610 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.645682 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.645717 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.645747 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.645800 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:06Z","lastTransitionTime":"2026-02-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.748591 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.748643 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.748654 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.748672 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.748683 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:06Z","lastTransitionTime":"2026-02-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.852188 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.852249 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.852266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.852294 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.852312 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:06Z","lastTransitionTime":"2026-02-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.954802 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.954837 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.954848 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.954864 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:06 crc kubenswrapper[4848]: I0217 09:06:06.954875 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:06Z","lastTransitionTime":"2026-02-17T09:06:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.058283 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.058324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.058334 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.058349 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.058359 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:07Z","lastTransitionTime":"2026-02-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.161274 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.161347 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.161371 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.161402 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.161424 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:07Z","lastTransitionTime":"2026-02-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.264559 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.264612 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.264628 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.264650 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.264667 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:07Z","lastTransitionTime":"2026-02-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.346214 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 05:54:07.04734328 +0000 UTC Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.367808 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.367905 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.367926 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.367952 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.367972 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:07Z","lastTransitionTime":"2026-02-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.382675 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.382711 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.382715 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:07 crc kubenswrapper[4848]: E0217 09:06:07.382901 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.382954 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:07 crc kubenswrapper[4848]: E0217 09:06:07.383094 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:07 crc kubenswrapper[4848]: E0217 09:06:07.383309 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:07 crc kubenswrapper[4848]: E0217 09:06:07.383401 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.471194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.471250 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.471267 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.471294 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.471311 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:07Z","lastTransitionTime":"2026-02-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.575060 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.575092 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.575102 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.575114 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.575123 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:07Z","lastTransitionTime":"2026-02-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.678460 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.678517 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.678535 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.678559 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.678576 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:07Z","lastTransitionTime":"2026-02-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.781088 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.781605 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.781830 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.782260 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.782449 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:07Z","lastTransitionTime":"2026-02-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.885732 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.885848 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.885878 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.885904 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.885921 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:07Z","lastTransitionTime":"2026-02-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.988498 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.988555 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.988571 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.988597 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:07 crc kubenswrapper[4848]: I0217 09:06:07.988614 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:07Z","lastTransitionTime":"2026-02-17T09:06:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.091641 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.091740 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.091813 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.091852 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.091883 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:08Z","lastTransitionTime":"2026-02-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.195216 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.195293 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.195316 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.195348 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.195370 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:08Z","lastTransitionTime":"2026-02-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.297557 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.297910 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.298074 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.298221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.298366 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:08Z","lastTransitionTime":"2026-02-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.346326 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 15:12:49.122357004 +0000 UTC Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.401200 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.401260 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.401306 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.401338 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.401365 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:08Z","lastTransitionTime":"2026-02-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.504648 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.504708 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.504725 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.504751 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.504844 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:08Z","lastTransitionTime":"2026-02-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.607881 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.608021 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.608047 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.608074 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.608095 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:08Z","lastTransitionTime":"2026-02-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.711067 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.711160 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.711187 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.711219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.711242 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:08Z","lastTransitionTime":"2026-02-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.814592 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.814679 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.814697 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.814722 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.814741 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:08Z","lastTransitionTime":"2026-02-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.917458 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.917526 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.917549 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.917578 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:08 crc kubenswrapper[4848]: I0217 09:06:08.917601 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:08Z","lastTransitionTime":"2026-02-17T09:06:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.020682 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.020748 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.020800 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.020829 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.020850 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.124817 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.124863 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.124875 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.124894 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.124904 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.228836 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.229279 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.229443 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.229603 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.229752 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.333434 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.333497 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.333515 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.333545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.333564 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.346963 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 18:22:09.717064292 +0000 UTC Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.382940 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.383035 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.383084 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.383243 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.383266 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.383400 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.383481 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.383651 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.436888 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.436968 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.436991 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.437021 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.437043 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.540474 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.540553 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.540576 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.540605 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.540628 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.642821 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.642872 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.642892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.642917 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.642936 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.695378 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.695435 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.695452 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.695476 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.695493 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.715949 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.720934 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.720961 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.720969 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.720982 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.720993 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.741713 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.747789 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.747816 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.747828 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.747845 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.747857 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.770048 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.770275 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.770375 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs podName:98bfddd8-4a1a-4b90-973a-adb75b02fdba nodeName:}" failed. No retries permitted until 2026-02-17 09:06:17.770348819 +0000 UTC m=+55.313604555 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs") pod "network-metrics-daemon-78r6x" (UID: "98bfddd8-4a1a-4b90-973a-adb75b02fdba") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.771711 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0
aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.777268 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.777509 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.777668 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.777840 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.778027 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.798119 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.803576 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.803631 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.803652 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.803696 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.803729 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.822800 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:09Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:09 crc kubenswrapper[4848]: E0217 09:06:09.823102 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.825600 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.825721 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.825740 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.825793 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.825810 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.929097 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.929185 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.929204 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.929229 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:09 crc kubenswrapper[4848]: I0217 09:06:09.929247 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:09Z","lastTransitionTime":"2026-02-17T09:06:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.032047 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.032104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.032122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.032145 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.032161 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:10Z","lastTransitionTime":"2026-02-17T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.135580 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.135660 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.135686 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.135718 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.135746 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:10Z","lastTransitionTime":"2026-02-17T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.239112 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.239221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.239239 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.239264 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.239323 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:10Z","lastTransitionTime":"2026-02-17T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.342213 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.342270 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.342286 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.342311 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.342333 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:10Z","lastTransitionTime":"2026-02-17T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.347858 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 06:20:49.135105777 +0000 UTC Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.445821 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.445912 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.445930 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.445984 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.446003 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:10Z","lastTransitionTime":"2026-02-17T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.548904 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.548955 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.548971 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.548990 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.549003 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:10Z","lastTransitionTime":"2026-02-17T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.652451 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.652528 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.652555 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.652589 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.652615 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:10Z","lastTransitionTime":"2026-02-17T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.755266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.755321 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.755345 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.755372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.755390 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:10Z","lastTransitionTime":"2026-02-17T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.859147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.859198 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.859216 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.859242 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.859258 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:10Z","lastTransitionTime":"2026-02-17T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.962446 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.962508 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.962531 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.962558 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:10 crc kubenswrapper[4848]: I0217 09:06:10.962580 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:10Z","lastTransitionTime":"2026-02-17T09:06:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.066167 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.066227 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.066244 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.066266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.066283 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:11Z","lastTransitionTime":"2026-02-17T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.143321 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.154963 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.163147 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.168786 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.168849 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.168865 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:11 crc 
kubenswrapper[4848]: I0217 09:06:11.168888 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.168905 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:11Z","lastTransitionTime":"2026-02-17T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.183812 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.199495 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.215390 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.232492 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.247373 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f0
5faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.262200 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.272347 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.272402 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.272418 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.272439 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.272454 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:11Z","lastTransitionTime":"2026-02-17T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.284453 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.298412 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.320015 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"6298 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 09:05:59.605652 6298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:59.605698 6298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 09:05:59.605746 6298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 09:05:59.605455 6298 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605884 6298 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 09:05:59.605117 6298 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:05:59.605123 6298 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.604852 6298 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605062 6298 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.606082 6298 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0217 09:05:59.606302 6298 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c
0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.340939 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30709
8be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.348835 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 19:38:16.465120564 +0000 UTC Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.358642 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c0
26b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\
\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.374378 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.375339 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.375470 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.375543 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.375627 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.375691 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:11Z","lastTransitionTime":"2026-02-17T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.383017 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.383121 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:11 crc kubenswrapper[4848]: E0217 09:06:11.383276 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.383321 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.383353 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:11 crc kubenswrapper[4848]: E0217 09:06:11.383871 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:11 crc kubenswrapper[4848]: E0217 09:06:11.383989 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:11 crc kubenswrapper[4848]: E0217 09:06:11.384157 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.388018 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.403310 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.415395 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:11Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:11 crc 
kubenswrapper[4848]: I0217 09:06:11.479088 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.479929 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.479970 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.479996 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.480013 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:11Z","lastTransitionTime":"2026-02-17T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.583635 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.583709 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.583731 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.583795 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.583823 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:11Z","lastTransitionTime":"2026-02-17T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.687015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.687097 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.687133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.687165 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.687189 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:11Z","lastTransitionTime":"2026-02-17T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.789557 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.789640 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.789668 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.789698 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.789720 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:11Z","lastTransitionTime":"2026-02-17T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.893141 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.893527 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.893553 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.893584 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.893658 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:11Z","lastTransitionTime":"2026-02-17T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.997115 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.997172 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.997189 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.997212 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:11 crc kubenswrapper[4848]: I0217 09:06:11.997229 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:11Z","lastTransitionTime":"2026-02-17T09:06:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.099724 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.099831 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.099855 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.099892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.099915 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:12Z","lastTransitionTime":"2026-02-17T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.202818 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.202895 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.202917 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.202945 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.202965 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:12Z","lastTransitionTime":"2026-02-17T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.306341 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.306397 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.306414 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.306438 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.306454 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:12Z","lastTransitionTime":"2026-02-17T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.349858 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 09:49:52.693190736 +0000 UTC Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.409397 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.409454 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.409473 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.409497 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.409515 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:12Z","lastTransitionTime":"2026-02-17T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.512328 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.513051 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.513089 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.513126 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.513152 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:12Z","lastTransitionTime":"2026-02-17T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.616048 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.616115 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.616137 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.616168 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.616191 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:12Z","lastTransitionTime":"2026-02-17T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.719833 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.719913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.719937 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.719963 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.719982 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:12Z","lastTransitionTime":"2026-02-17T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.823221 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.823297 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.823315 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.823347 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.823364 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:12Z","lastTransitionTime":"2026-02-17T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.926508 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.926554 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.926565 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.926582 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:12 crc kubenswrapper[4848]: I0217 09:06:12.926597 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:12Z","lastTransitionTime":"2026-02-17T09:06:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.029568 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.029844 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.029975 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.030061 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.030143 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:13Z","lastTransitionTime":"2026-02-17T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.133015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.133086 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.133104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.133128 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.133145 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:13Z","lastTransitionTime":"2026-02-17T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.237969 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.238191 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.238263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.238368 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.238448 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:13Z","lastTransitionTime":"2026-02-17T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.340881 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.340965 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.340991 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.341023 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.341048 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:13Z","lastTransitionTime":"2026-02-17T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.350481 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 22:55:56.010035836 +0000 UTC Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.382858 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.383063 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:13 crc kubenswrapper[4848]: E0217 09:06:13.383286 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.383375 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.383470 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:13 crc kubenswrapper[4848]: E0217 09:06:13.383626 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:13 crc kubenswrapper[4848]: E0217 09:06:13.383813 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:13 crc kubenswrapper[4848]: E0217 09:06:13.384070 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.402126 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08
aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.434256 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"6298 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 09:05:59.605652 6298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:59.605698 6298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 09:05:59.605746 6298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 09:05:59.605455 6298 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605884 6298 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 09:05:59.605117 6298 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:05:59.605123 6298 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.604852 6298 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605062 6298 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.606082 6298 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0217 09:05:59.606302 6298 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c
0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.443818 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.443906 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.443928 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.443953 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.444002 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:13Z","lastTransitionTime":"2026-02-17T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.460856 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.477327 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.493056 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.509073 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.527958 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc 
kubenswrapper[4848]: I0217 09:06:13.546505 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.546568 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.546585 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.546609 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.546627 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:13Z","lastTransitionTime":"2026-02-17T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.548707 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b77
75917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.567388 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.583698 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.600909 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.619439 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.642480 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.649932 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:13 crc 
kubenswrapper[4848]: I0217 09:06:13.649977 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.649995 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.650018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.650035 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:13Z","lastTransitionTime":"2026-02-17T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.664810 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d381803152a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.689114 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff
909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.710083 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.729812 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:13Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.752498 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.752622 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.752729 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.752877 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.753008 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:13Z","lastTransitionTime":"2026-02-17T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.856109 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.856177 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.856196 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.856319 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.856378 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:13Z","lastTransitionTime":"2026-02-17T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.977971 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.978082 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.978113 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.978141 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:13 crc kubenswrapper[4848]: I0217 09:06:13.978187 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:13Z","lastTransitionTime":"2026-02-17T09:06:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.080969 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.081031 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.081050 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.081074 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.081092 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:14Z","lastTransitionTime":"2026-02-17T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.184608 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.184659 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.184683 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.184715 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.184749 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:14Z","lastTransitionTime":"2026-02-17T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.287622 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.287795 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.287814 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.287837 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.287854 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:14Z","lastTransitionTime":"2026-02-17T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.351514 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 11:49:31.599139332 +0000 UTC Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.390986 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.391053 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.391075 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.391102 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.391125 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:14Z","lastTransitionTime":"2026-02-17T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.494694 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.494752 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.494806 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.494832 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.494849 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:14Z","lastTransitionTime":"2026-02-17T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.598015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.598060 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.598075 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.598098 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.598116 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:14Z","lastTransitionTime":"2026-02-17T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.701793 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.701854 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.701876 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.701904 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.701926 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:14Z","lastTransitionTime":"2026-02-17T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.805022 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.805076 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.805094 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.805118 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.805134 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:14Z","lastTransitionTime":"2026-02-17T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.908365 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.908848 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.909054 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.909250 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:14 crc kubenswrapper[4848]: I0217 09:06:14.909462 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:14Z","lastTransitionTime":"2026-02-17T09:06:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.012436 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.012504 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.012523 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.012545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.012562 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:15Z","lastTransitionTime":"2026-02-17T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.115308 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.115559 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.115702 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.115843 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.115972 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:15Z","lastTransitionTime":"2026-02-17T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.139352 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.139621 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 09:06:47.139588918 +0000 UTC m=+84.682844604 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.140293 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.140644 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.140909 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.140584 4848 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.141119 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.141259 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:06:47.141195002 +0000 UTC m=+84.684450688 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.140965 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.141301 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.141322 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.141400 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 09:06:47.141375777 +0000 UTC m=+84.684631463 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.141095 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.141459 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:06:47.141443769 +0000 UTC m=+84.684699445 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.142010 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.142156 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.142292 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.142494 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 09:06:47.142469757 +0000 UTC m=+84.685725443 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.219328 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.219371 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.219382 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.219398 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.219409 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:15Z","lastTransitionTime":"2026-02-17T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.322029 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.322099 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.322117 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.322144 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.322163 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:15Z","lastTransitionTime":"2026-02-17T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.352522 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 18:42:06.696697629 +0000 UTC Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.383023 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.383052 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.383492 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.383168 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.383578 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.383155 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.383631 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:15 crc kubenswrapper[4848]: E0217 09:06:15.383823 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.425679 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.425738 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.425800 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.425825 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.425843 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:15Z","lastTransitionTime":"2026-02-17T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.528014 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.528057 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.528073 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.528097 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.528114 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:15Z","lastTransitionTime":"2026-02-17T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.630656 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.631516 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.631669 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.631840 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.632037 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:15Z","lastTransitionTime":"2026-02-17T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.734718 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.734825 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.734845 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.734876 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.734893 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:15Z","lastTransitionTime":"2026-02-17T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.838554 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.838601 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.838617 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.838645 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.838662 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:15Z","lastTransitionTime":"2026-02-17T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.941550 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.942073 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.942296 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.942459 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:15 crc kubenswrapper[4848]: I0217 09:06:15.942589 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:15Z","lastTransitionTime":"2026-02-17T09:06:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.046310 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.046371 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.046389 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.046417 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.046461 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:16Z","lastTransitionTime":"2026-02-17T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.149171 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.149220 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.149235 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.149257 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.149275 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:16Z","lastTransitionTime":"2026-02-17T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.254197 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.254285 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.254310 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.254341 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.254379 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:16Z","lastTransitionTime":"2026-02-17T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.353014 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 11:55:48.10794511 +0000 UTC Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.357335 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.357392 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.357411 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.357435 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.357453 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:16Z","lastTransitionTime":"2026-02-17T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.383495 4848 scope.go:117] "RemoveContainer" containerID="dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.460447 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.460503 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.460522 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.460549 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.460570 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:16Z","lastTransitionTime":"2026-02-17T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.564111 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.564162 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.564185 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.564212 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.564232 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:16Z","lastTransitionTime":"2026-02-17T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.668372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.668442 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.668467 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.668498 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.668520 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:16Z","lastTransitionTime":"2026-02-17T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.751262 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/1.log" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.755504 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.756490 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.772254 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.772305 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.772323 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.772345 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.772363 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:16Z","lastTransitionTime":"2026-02-17T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.784488 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.804359 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.828835 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.843678 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.875195 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.875252 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.875263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.875278 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.875312 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:16Z","lastTransitionTime":"2026-02-17T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.878316 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"6298 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 09:05:59.605652 6298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:59.605698 6298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 09:05:59.605746 6298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 09:05:59.605455 6298 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605884 6298 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 09:05:59.605117 6298 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:05:59.605123 6298 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.604852 6298 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605062 6298 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.606082 6298 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0217 09:05:59.606302 6298 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.906479 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.925719 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.942326 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.958082 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.975438 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:16 crc 
kubenswrapper[4848]: I0217 09:06:16.977581 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.977619 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.977629 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.977645 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.977656 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:16Z","lastTransitionTime":"2026-02-17T09:06:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:16 crc kubenswrapper[4848]: I0217 09:06:16.991982 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:16Z 
is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.006270 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d3818031
52a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.019739 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff
909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.032317 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.042647 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.053685 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.063795 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.079649 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.079677 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.079687 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:17 crc 
kubenswrapper[4848]: I0217 09:06:17.079703 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.079715 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:17Z","lastTransitionTime":"2026-02-17T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.182390 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.182435 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.182448 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.182466 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.182479 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:17Z","lastTransitionTime":"2026-02-17T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.284616 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.284667 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.284687 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.284711 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.284728 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:17Z","lastTransitionTime":"2026-02-17T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.353359 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 03:41:35.601752639 +0000 UTC Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.383038 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:17 crc kubenswrapper[4848]: E0217 09:06:17.383238 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.383106 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:17 crc kubenswrapper[4848]: E0217 09:06:17.383494 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.383061 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:17 crc kubenswrapper[4848]: E0217 09:06:17.383670 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.383292 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:17 crc kubenswrapper[4848]: E0217 09:06:17.383889 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.387140 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.387290 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.387380 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.387453 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.387521 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:17Z","lastTransitionTime":"2026-02-17T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.489819 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.489888 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.489923 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.489953 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.489975 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:17Z","lastTransitionTime":"2026-02-17T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.592821 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.592865 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.592883 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.592905 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.592921 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:17Z","lastTransitionTime":"2026-02-17T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.695422 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.696694 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.696850 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.696975 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.697122 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:17Z","lastTransitionTime":"2026-02-17T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.760731 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/2.log" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.761836 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/1.log" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.765572 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d" exitCode=1 Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.765620 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d"} Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.765663 4848 scope.go:117] "RemoveContainer" containerID="dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.766699 4848 scope.go:117] "RemoveContainer" containerID="f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d" Feb 17 09:06:17 crc kubenswrapper[4848]: E0217 09:06:17.766955 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.792293 4848 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224
b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"
state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.800098 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.800497 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.800709 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.800883 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.801002 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:17Z","lastTransitionTime":"2026-02-17T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.806537 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.819297 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.836427 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.857003 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.871722 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:17 crc kubenswrapper[4848]: E0217 09:06:17.872433 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:17 crc kubenswrapper[4848]: E0217 09:06:17.872533 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs podName:98bfddd8-4a1a-4b90-973a-adb75b02fdba nodeName:}" failed. No retries permitted until 2026-02-17 09:06:33.872500658 +0000 UTC m=+71.415756374 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs") pod "network-metrics-daemon-78r6x" (UID: "98bfddd8-4a1a-4b90-973a-adb75b02fdba") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.875421 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d381803152a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.888301 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.904035 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.904198 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.904281 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:17 crc 
kubenswrapper[4848]: I0217 09:06:17.904393 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.904568 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:17Z","lastTransitionTime":"2026-02-17T09:06:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.906696 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.921413 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.934841 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.957025 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfd3aa1d845e743861894094c9d5835ee87ee37e05d0a45c6c00339742aa5466\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:05:59Z\\\",\\\"message\\\":\\\"6298 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 09:05:59.605652 6298 handler.go:208] Removed *v1.Node event handler 7\\\\nI0217 09:05:59.605698 6298 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 09:05:59.605746 6298 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 09:05:59.605455 6298 
reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605884 6298 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 09:05:59.605117 6298 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:05:59.605123 6298 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.604852 6298 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.605062 6298 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0217 09:05:59.606082 6298 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0217 09:05:59.606302 6298 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:17Z\\\",\\\"message\\\":\\\".361707 6524 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 09:06:17.361795 6524 services_controller.go:452] Built service openshift-etcd/etcd per-node LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361812 6524 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361828 6524 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:06:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",
\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 
09:06:17.976874 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c
8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:17 crc kubenswrapper[4848]: I0217 09:06:17.990783 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:17Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.003869 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.007257 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.007292 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.007300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.007314 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.007328 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:18Z","lastTransitionTime":"2026-02-17T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.018278 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.032190 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc 
kubenswrapper[4848]: I0217 09:06:18.048334 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.109946 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.110005 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.110021 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.110043 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.110057 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:18Z","lastTransitionTime":"2026-02-17T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.213202 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.213361 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.213391 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.213430 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.213458 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:18Z","lastTransitionTime":"2026-02-17T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.316790 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.317104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.317236 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.317380 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.317523 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:18Z","lastTransitionTime":"2026-02-17T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.354093 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 08:53:19.817154616 +0000 UTC Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.422154 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.422230 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.422254 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.422282 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.422300 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:18Z","lastTransitionTime":"2026-02-17T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.527420 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.527518 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.527532 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.527558 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.527579 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:18Z","lastTransitionTime":"2026-02-17T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.631004 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.631069 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.631091 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.631122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.631152 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:18Z","lastTransitionTime":"2026-02-17T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.735414 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.735489 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.735516 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.735547 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.735570 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:18Z","lastTransitionTime":"2026-02-17T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.773221 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/2.log" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.778331 4848 scope.go:117] "RemoveContainer" containerID="f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d" Feb 17 09:06:18 crc kubenswrapper[4848]: E0217 09:06:18.778544 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.807723 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30709
8be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.828975 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.840254 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.840316 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.840340 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.840369 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.840388 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:18Z","lastTransitionTime":"2026-02-17T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.846093 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.862275 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.884689 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:17Z\\\",\\\"message\\\":\\\".361707 6524 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 09:06:17.361795 6524 services_controller.go:452] Built service openshift-etcd/etcd per-node LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361812 6524 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361828 6524 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c
0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.904072 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.936379 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.953109 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.953201 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.953219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:18 crc 
kubenswrapper[4848]: I0217 09:06:18.953242 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.953259 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:18Z","lastTransitionTime":"2026-02-17T09:06:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.960142 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.978939 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c1
7b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:18 crc kubenswrapper[4848]: I0217 09:06:18.989736 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:18Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc 
kubenswrapper[4848]: I0217 09:06:19.003263 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\"
:\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc 
kubenswrapper[4848]: I0217 09:06:19.018206 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d381803152a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.031048 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff
909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.048680 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.055532 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.055576 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.055587 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 
09:06:19.055604 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.055617 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.064699 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.084397 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.095679 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.158658 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.158802 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.158823 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc 
kubenswrapper[4848]: I0217 09:06:19.158845 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.158861 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.261752 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.261857 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.261882 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.261913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.261936 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.355112 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 11:19:42.233181702 +0000 UTC Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.363750 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.363788 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.363800 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.363815 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.363826 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.382678 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.382721 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.382720 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.382848 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:19 crc kubenswrapper[4848]: E0217 09:06:19.382937 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:19 crc kubenswrapper[4848]: E0217 09:06:19.383016 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:19 crc kubenswrapper[4848]: E0217 09:06:19.383086 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:19 crc kubenswrapper[4848]: E0217 09:06:19.383189 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.466671 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.466719 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.466737 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.466816 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.466834 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.570123 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.570168 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.570187 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.570208 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.570224 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.673590 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.673637 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.673653 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.673678 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.673695 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.776822 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.776895 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.776913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.776936 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.776954 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.883060 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.884917 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.884971 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.885003 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.885026 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.913901 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.913960 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.913977 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.914000 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.914017 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: E0217 09:06:19.929937 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.934644 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.934785 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.934817 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.934847 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.934870 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: E0217 09:06:19.955961 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload identical to the preceding attempt; omitted] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.960589 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.960661 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.960674 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.960713 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.960728 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:19 crc kubenswrapper[4848]: E0217 09:06:19.985428 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [node status patch payload identical to the preceding attempt; omitted] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:19Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.992900 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.992939 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.992949 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.992965 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:19 crc kubenswrapper[4848]: I0217 09:06:19.992978 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:19Z","lastTransitionTime":"2026-02-17T09:06:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: E0217 09:06:20.012710 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.017178 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.017234 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.017252 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.017275 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.017292 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: E0217 09:06:20.034836 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:20Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:20 crc kubenswrapper[4848]: E0217 09:06:20.035049 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.037274 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.037326 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.037337 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.037355 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.037366 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.140660 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.140720 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.140738 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.140793 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.140813 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.243524 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.243589 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.243606 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.243632 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.243652 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.346710 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.347060 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.347120 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.347144 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.347660 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.356215 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 15:02:13.726419529 +0000 UTC Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.450598 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.450693 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.450717 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.450744 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.450800 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.553616 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.553691 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.553708 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.553734 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.553752 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.656725 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.656879 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.656897 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.656921 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.656940 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.759868 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.759963 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.759981 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.760005 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.760023 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.863253 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.863404 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.863429 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.863470 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.863487 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.966124 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.966160 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.966170 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.966186 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:20 crc kubenswrapper[4848]: I0217 09:06:20.966198 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:20Z","lastTransitionTime":"2026-02-17T09:06:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.068232 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.068345 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.068357 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.068372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.068381 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:21Z","lastTransitionTime":"2026-02-17T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.170217 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.170254 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.170263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.170276 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.170287 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:21Z","lastTransitionTime":"2026-02-17T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.273102 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.273155 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.273171 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.273192 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.273209 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:21Z","lastTransitionTime":"2026-02-17T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.357245 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 08:24:15.07065027 +0000 UTC Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.375441 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.375495 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.375514 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.375534 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.375550 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:21Z","lastTransitionTime":"2026-02-17T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.383237 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.383307 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.383269 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.383361 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:21 crc kubenswrapper[4848]: E0217 09:06:21.383420 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:21 crc kubenswrapper[4848]: E0217 09:06:21.383545 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:21 crc kubenswrapper[4848]: E0217 09:06:21.383660 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:21 crc kubenswrapper[4848]: E0217 09:06:21.383732 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.479028 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.479098 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.479116 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.479193 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.479220 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:21Z","lastTransitionTime":"2026-02-17T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.582508 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.582565 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.582581 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.582605 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.582621 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:21Z","lastTransitionTime":"2026-02-17T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.686164 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.686231 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.686302 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.686588 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.686610 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:21Z","lastTransitionTime":"2026-02-17T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.788960 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.789012 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.789028 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.789049 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.789066 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:21Z","lastTransitionTime":"2026-02-17T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.892104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.892171 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.892191 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.892216 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.892234 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:21Z","lastTransitionTime":"2026-02-17T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.995541 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.995610 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.995634 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.995665 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:21 crc kubenswrapper[4848]: I0217 09:06:21.995687 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:21Z","lastTransitionTime":"2026-02-17T09:06:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.098282 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.098347 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.098369 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.098398 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.098421 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:22Z","lastTransitionTime":"2026-02-17T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.201499 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.201888 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.202042 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.202229 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.202407 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:22Z","lastTransitionTime":"2026-02-17T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.305326 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.305418 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.305444 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.305474 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.305499 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:22Z","lastTransitionTime":"2026-02-17T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.358051 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 01:28:20.572351515 +0000 UTC Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.409089 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.409980 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.410101 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.410142 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.410169 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:22Z","lastTransitionTime":"2026-02-17T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.512718 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.512794 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.512812 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.512835 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.512871 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:22Z","lastTransitionTime":"2026-02-17T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.615301 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.615349 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.615366 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.615395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.615412 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:22Z","lastTransitionTime":"2026-02-17T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.729313 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.729373 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.729399 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.729429 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.729448 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:22Z","lastTransitionTime":"2026-02-17T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.831910 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.831973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.831990 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.832015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.832034 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:22Z","lastTransitionTime":"2026-02-17T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.934928 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.935002 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.935025 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.935047 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:22 crc kubenswrapper[4848]: I0217 09:06:22.935063 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:22Z","lastTransitionTime":"2026-02-17T09:06:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.037943 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.037991 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.038007 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.038029 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.038045 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:23Z","lastTransitionTime":"2026-02-17T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.140490 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.140522 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.140530 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.140542 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.140551 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:23Z","lastTransitionTime":"2026-02-17T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.243058 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.243092 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.243104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.243122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.243133 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:23Z","lastTransitionTime":"2026-02-17T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.344979 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.345023 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.345034 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.345049 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.345065 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:23Z","lastTransitionTime":"2026-02-17T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.358366 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 23:18:56.528783616 +0000 UTC Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.382360 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:23 crc kubenswrapper[4848]: E0217 09:06:23.383245 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.383379 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.383405 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:23 crc kubenswrapper[4848]: E0217 09:06:23.383448 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.383640 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:23 crc kubenswrapper[4848]: E0217 09:06:23.383696 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:23 crc kubenswrapper[4848]: E0217 09:06:23.383734 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.403907 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f
13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.418504 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d381803152a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernete
s/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.440487 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff
909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.447797 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.447968 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.448073 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.448188 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.448290 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:23Z","lastTransitionTime":"2026-02-17T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.456157 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.477678 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.495693 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.509695 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.531973 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30709
8be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.548033 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.550422 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.550443 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.550451 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.550466 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.550476 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:23Z","lastTransitionTime":"2026-02-17T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.567616 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.578729 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.598603 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:17Z\\\",\\\"message\\\":\\\".361707 6524 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 09:06:17.361795 6524 services_controller.go:452] Built service openshift-etcd/etcd per-node LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361812 6524 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361828 6524 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c
0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.619039 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.637099 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.653671 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.653716 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.653732 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:23 crc 
kubenswrapper[4848]: I0217 09:06:23.653756 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.653807 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:23Z","lastTransitionTime":"2026-02-17T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.654122 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7
f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.670177 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c1
7b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.687056 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:23Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:23 crc 
kubenswrapper[4848]: I0217 09:06:23.756324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.756374 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.756385 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.756401 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.756412 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:23Z","lastTransitionTime":"2026-02-17T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.859069 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.859131 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.859149 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.859172 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.859193 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:23Z","lastTransitionTime":"2026-02-17T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.962299 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.962366 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.962392 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.962424 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:23 crc kubenswrapper[4848]: I0217 09:06:23.962447 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:23Z","lastTransitionTime":"2026-02-17T09:06:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.065412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.065473 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.065490 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.065517 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.065534 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:24Z","lastTransitionTime":"2026-02-17T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.169123 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.169204 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.169235 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.169266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.169286 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:24Z","lastTransitionTime":"2026-02-17T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.272068 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.272127 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.272144 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.272167 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.272189 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:24Z","lastTransitionTime":"2026-02-17T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.359527 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 19:37:25.045521972 +0000 UTC Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.374965 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.375018 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.375034 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.375058 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.375075 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:24Z","lastTransitionTime":"2026-02-17T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.478487 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.478548 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.478569 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.478593 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.478611 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:24Z","lastTransitionTime":"2026-02-17T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.582104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.582183 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.582203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.582229 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.582247 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:24Z","lastTransitionTime":"2026-02-17T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.685128 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.685202 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.685224 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.685308 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.685340 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:24Z","lastTransitionTime":"2026-02-17T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.789168 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.789396 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.789412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.789436 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.789456 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:24Z","lastTransitionTime":"2026-02-17T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.893280 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.893348 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.893365 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.893388 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.893405 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:24Z","lastTransitionTime":"2026-02-17T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.997076 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.997139 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.997157 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.997182 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:24 crc kubenswrapper[4848]: I0217 09:06:24.997201 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:24Z","lastTransitionTime":"2026-02-17T09:06:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.099386 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.099427 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.099438 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.099454 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.099468 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:25Z","lastTransitionTime":"2026-02-17T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.203007 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.203081 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.203107 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.203139 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.203164 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:25Z","lastTransitionTime":"2026-02-17T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.306664 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.306725 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.306746 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.306798 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.306818 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:25Z","lastTransitionTime":"2026-02-17T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.360651 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 10:03:46.884124705 +0000 UTC Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.382915 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.382915 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.383073 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.383167 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:25 crc kubenswrapper[4848]: E0217 09:06:25.383153 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:25 crc kubenswrapper[4848]: E0217 09:06:25.383345 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:25 crc kubenswrapper[4848]: E0217 09:06:25.383440 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:25 crc kubenswrapper[4848]: E0217 09:06:25.383577 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.408579 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.408637 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.408660 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.408688 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.408715 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:25Z","lastTransitionTime":"2026-02-17T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.511618 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.511677 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.511693 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.511740 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.511783 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:25Z","lastTransitionTime":"2026-02-17T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.615703 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.615803 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.615828 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.615855 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.615876 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:25Z","lastTransitionTime":"2026-02-17T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.718284 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.718345 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.718364 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.718388 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.718430 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:25Z","lastTransitionTime":"2026-02-17T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.821086 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.821153 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.821172 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.821195 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.821216 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:25Z","lastTransitionTime":"2026-02-17T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.924267 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.924323 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.924337 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.924355 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:25 crc kubenswrapper[4848]: I0217 09:06:25.924367 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:25Z","lastTransitionTime":"2026-02-17T09:06:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.027157 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.027196 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.027204 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.027217 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.027228 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:26Z","lastTransitionTime":"2026-02-17T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.129876 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.129922 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.129934 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.129950 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.129978 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:26Z","lastTransitionTime":"2026-02-17T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.232544 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.232610 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.232628 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.232652 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.232669 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:26Z","lastTransitionTime":"2026-02-17T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.335598 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.335666 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.335684 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.335709 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.335728 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:26Z","lastTransitionTime":"2026-02-17T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.361158 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 14:26:53.410762118 +0000 UTC Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.438194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.438227 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.438237 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.438251 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.438263 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:26Z","lastTransitionTime":"2026-02-17T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.541503 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.541562 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.541572 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.541586 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.541597 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:26Z","lastTransitionTime":"2026-02-17T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.644143 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.644222 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.644234 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.644251 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.644266 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:26Z","lastTransitionTime":"2026-02-17T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.747041 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.747086 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.747121 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.747140 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.747153 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:26Z","lastTransitionTime":"2026-02-17T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.849997 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.850063 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.850080 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.850103 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.850121 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:26Z","lastTransitionTime":"2026-02-17T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.953753 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.953862 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.953884 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.953913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:26 crc kubenswrapper[4848]: I0217 09:06:26.953936 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:26Z","lastTransitionTime":"2026-02-17T09:06:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.057401 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.057503 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.057521 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.057578 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.057610 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:27Z","lastTransitionTime":"2026-02-17T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.161105 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.161264 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.161282 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.161308 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.161326 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:27Z","lastTransitionTime":"2026-02-17T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.268422 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.268503 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.268531 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.268578 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.268606 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:27Z","lastTransitionTime":"2026-02-17T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.361696 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 19:05:29.897182524 +0000 UTC Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.373030 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.373075 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.373087 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.373104 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.373116 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:27Z","lastTransitionTime":"2026-02-17T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.412223 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.412287 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:27 crc kubenswrapper[4848]: E0217 09:06:27.412458 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.412244 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:27 crc kubenswrapper[4848]: E0217 09:06:27.412737 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:27 crc kubenswrapper[4848]: E0217 09:06:27.413072 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.413123 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:27 crc kubenswrapper[4848]: E0217 09:06:27.413168 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.476665 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.476714 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.476726 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.476741 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.476753 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:27Z","lastTransitionTime":"2026-02-17T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.580274 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.580314 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.580326 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.580341 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.580352 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:27Z","lastTransitionTime":"2026-02-17T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.682611 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.682649 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.682658 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.682671 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.682681 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:27Z","lastTransitionTime":"2026-02-17T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.785720 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.785809 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.785828 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.785853 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.785871 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:27Z","lastTransitionTime":"2026-02-17T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.888035 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.888111 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.888134 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.888502 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.888553 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:27Z","lastTransitionTime":"2026-02-17T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.990621 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.990690 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.990707 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.990730 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:27 crc kubenswrapper[4848]: I0217 09:06:27.990748 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:27Z","lastTransitionTime":"2026-02-17T09:06:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.092535 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.092572 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.092582 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.092598 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.092606 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:28Z","lastTransitionTime":"2026-02-17T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.195644 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.195693 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.195703 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.195717 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.195729 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:28Z","lastTransitionTime":"2026-02-17T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.298799 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.299147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.299168 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.299194 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.299211 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:28Z","lastTransitionTime":"2026-02-17T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.362414 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 20:54:30.314271888 +0000 UTC Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.401686 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.401743 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.401784 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.401808 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.401826 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:28Z","lastTransitionTime":"2026-02-17T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.503863 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.503905 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.503914 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.503926 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.503935 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:28Z","lastTransitionTime":"2026-02-17T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.606400 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.606462 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.606479 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.606503 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.606521 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:28Z","lastTransitionTime":"2026-02-17T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.709654 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.709695 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.709705 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.709721 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.709734 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:28Z","lastTransitionTime":"2026-02-17T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.811746 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.811798 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.811809 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.811823 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.811833 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:28Z","lastTransitionTime":"2026-02-17T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.914525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.914580 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.914594 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.914614 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:28 crc kubenswrapper[4848]: I0217 09:06:28.914632 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:28Z","lastTransitionTime":"2026-02-17T09:06:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.017037 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.017081 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.017098 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.017122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.017136 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:29Z","lastTransitionTime":"2026-02-17T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.119481 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.119517 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.119529 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.119545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.119557 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:29Z","lastTransitionTime":"2026-02-17T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.221913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.221988 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.222013 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.222039 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.222059 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:29Z","lastTransitionTime":"2026-02-17T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.325266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.325329 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.325341 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.325363 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.325377 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:29Z","lastTransitionTime":"2026-02-17T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.362815 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 18:29:20.629325166 +0000 UTC Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.383987 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.384018 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.384214 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:29 crc kubenswrapper[4848]: E0217 09:06:29.384179 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:29 crc kubenswrapper[4848]: E0217 09:06:29.384355 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:29 crc kubenswrapper[4848]: E0217 09:06:29.384437 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.385212 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:29 crc kubenswrapper[4848]: E0217 09:06:29.385438 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.428803 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.428863 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.428882 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.428907 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.428925 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:29Z","lastTransitionTime":"2026-02-17T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.531394 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.531452 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.531469 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.531492 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.531509 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:29Z","lastTransitionTime":"2026-02-17T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.633685 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.633749 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.633801 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.633826 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.633845 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:29Z","lastTransitionTime":"2026-02-17T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.736264 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.736305 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.736314 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.736328 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.736338 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:29Z","lastTransitionTime":"2026-02-17T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.837910 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.837939 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.837948 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.837960 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.837969 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:29Z","lastTransitionTime":"2026-02-17T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.940874 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.940931 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.940946 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.940990 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:29 crc kubenswrapper[4848]: I0217 09:06:29.941006 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:29Z","lastTransitionTime":"2026-02-17T09:06:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.043702 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.043727 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.043735 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.043747 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.043769 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.146310 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.146363 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.146381 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.146404 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.146422 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.241201 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.241974 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.242002 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.242019 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.242031 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: E0217 09:06:30.258634 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.263529 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.263600 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.263629 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.263661 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.263683 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: E0217 09:06:30.280331 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.284355 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.284388 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.284397 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.284412 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.284422 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: E0217 09:06:30.297433 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.301728 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.301881 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.302020 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.302051 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.302068 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: E0217 09:06:30.322142 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.327201 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.327244 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.327253 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.327286 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.327297 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: E0217 09:06:30.341660 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:30Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:30 crc kubenswrapper[4848]: E0217 09:06:30.341814 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.343806 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.343830 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.343837 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.343850 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.343859 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.363322 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 18:59:38.2902548 +0000 UTC Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.446432 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.446518 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.446545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.446580 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.446606 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.549603 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.549876 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.549946 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.550023 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.550095 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.652886 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.652967 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.652985 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.653011 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.653028 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.755486 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.755513 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.755523 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.755534 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.755544 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.858233 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.858284 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.858305 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.858331 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.858352 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.960825 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.960882 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.960930 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.960954 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:30 crc kubenswrapper[4848]: I0217 09:06:30.960971 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:30Z","lastTransitionTime":"2026-02-17T09:06:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.063737 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.063818 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.063837 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.063861 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.063877 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:31Z","lastTransitionTime":"2026-02-17T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.167083 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.167138 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.167156 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.167179 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.167197 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:31Z","lastTransitionTime":"2026-02-17T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.269977 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.270048 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.270072 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.270103 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.270131 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:31Z","lastTransitionTime":"2026-02-17T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.364404 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 02:03:20.068000009 +0000 UTC Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.375367 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.375457 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.375488 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.375522 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.375559 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:31Z","lastTransitionTime":"2026-02-17T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.382439 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:31 crc kubenswrapper[4848]: E0217 09:06:31.382599 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.382914 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:31 crc kubenswrapper[4848]: E0217 09:06:31.383217 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.383634 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:31 crc kubenswrapper[4848]: E0217 09:06:31.383724 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.383843 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:31 crc kubenswrapper[4848]: E0217 09:06:31.384002 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.479304 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.479356 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.479366 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.479383 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.479393 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:31Z","lastTransitionTime":"2026-02-17T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.582466 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.582524 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.582533 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.582548 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.582556 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:31Z","lastTransitionTime":"2026-02-17T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.685814 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.685863 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.685873 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.685888 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.685897 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:31Z","lastTransitionTime":"2026-02-17T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.788464 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.788501 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.788511 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.788527 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.788539 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:31Z","lastTransitionTime":"2026-02-17T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.892438 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.892503 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.892526 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.892557 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.892581 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:31Z","lastTransitionTime":"2026-02-17T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.995420 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.995483 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.995498 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.995514 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:31 crc kubenswrapper[4848]: I0217 09:06:31.995526 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:31Z","lastTransitionTime":"2026-02-17T09:06:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.098092 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.098146 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.098162 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.098184 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.098199 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:32Z","lastTransitionTime":"2026-02-17T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.200350 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.200391 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.200403 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.200422 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.200434 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:32Z","lastTransitionTime":"2026-02-17T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.302586 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.302642 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.302653 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.302669 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.302680 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:32Z","lastTransitionTime":"2026-02-17T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.365562 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 17:21:03.685801029 +0000 UTC Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.384090 4848 scope.go:117] "RemoveContainer" containerID="f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d" Feb 17 09:06:32 crc kubenswrapper[4848]: E0217 09:06:32.384426 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.405940 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.405999 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.406024 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.406052 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.406073 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:32Z","lastTransitionTime":"2026-02-17T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.508639 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.508687 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.508698 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.508716 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.508727 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:32Z","lastTransitionTime":"2026-02-17T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.611234 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.611269 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.611280 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.611295 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.611306 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:32Z","lastTransitionTime":"2026-02-17T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.713738 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.713795 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.713806 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.713821 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.713833 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:32Z","lastTransitionTime":"2026-02-17T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.816825 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.816877 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.816893 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.816914 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.816930 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:32Z","lastTransitionTime":"2026-02-17T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.918676 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.918710 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.918720 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.918732 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:32 crc kubenswrapper[4848]: I0217 09:06:32.918740 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:32Z","lastTransitionTime":"2026-02-17T09:06:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.020721 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.020753 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.020782 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.020796 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.020807 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:33Z","lastTransitionTime":"2026-02-17T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.123717 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.123786 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.123800 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.123817 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.123830 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:33Z","lastTransitionTime":"2026-02-17T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.226642 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.226700 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.226718 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.226744 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.226788 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:33Z","lastTransitionTime":"2026-02-17T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.329197 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.329239 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.329250 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.329264 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.329273 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:33Z","lastTransitionTime":"2026-02-17T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.366012 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 12:36:18.118181242 +0000 UTC Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.382345 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:33 crc kubenswrapper[4848]: E0217 09:06:33.382628 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.382676 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.382798 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:33 crc kubenswrapper[4848]: E0217 09:06:33.382847 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.382652 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:33 crc kubenswrapper[4848]: E0217 09:06:33.383087 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:33 crc kubenswrapper[4848]: E0217 09:06:33.383222 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.401954 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7934
26f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.420977 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.431796 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.431860 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.431883 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.431913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.431936 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:33Z","lastTransitionTime":"2026-02-17T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.439580 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.451930 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.477036 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:17Z\\\",\\\"message\\\":\\\".361707 6524 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 09:06:17.361795 6524 services_controller.go:452] Built service openshift-etcd/etcd per-node LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361812 6524 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361828 6524 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c
0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.498740 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30709
8be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.516041 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.534526 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.534683 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.534716 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.534727 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.534745 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.534777 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:33Z","lastTransitionTime":"2026-02-17T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.550426 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.566868 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c1
7b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.581020 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc 
kubenswrapper[4848]: I0217 09:06:33.595516 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d381803152a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.615615 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff
909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.634565 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.638246 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.638319 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.638347 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 
09:06:33.638378 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.638400 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:33Z","lastTransitionTime":"2026-02-17T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.651884 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.669119 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.690279 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:33Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.740505 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:33 crc 
kubenswrapper[4848]: I0217 09:06:33.740544 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.740562 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.740578 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.740588 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:33Z","lastTransitionTime":"2026-02-17T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.842960 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.843211 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.843319 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.843395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.843455 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:33Z","lastTransitionTime":"2026-02-17T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.946274 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.946530 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.946591 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.946665 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.946726 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:33Z","lastTransitionTime":"2026-02-17T09:06:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:33 crc kubenswrapper[4848]: I0217 09:06:33.953608 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:33 crc kubenswrapper[4848]: E0217 09:06:33.953714 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:33 crc kubenswrapper[4848]: E0217 09:06:33.953780 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs podName:98bfddd8-4a1a-4b90-973a-adb75b02fdba nodeName:}" failed. No retries permitted until 2026-02-17 09:07:05.953751272 +0000 UTC m=+103.497006908 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs") pod "network-metrics-daemon-78r6x" (UID: "98bfddd8-4a1a-4b90-973a-adb75b02fdba") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.049092 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.049144 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.049160 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.049214 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.049232 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:34Z","lastTransitionTime":"2026-02-17T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.151796 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.152008 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.152067 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.152126 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.152235 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:34Z","lastTransitionTime":"2026-02-17T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.254791 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.255031 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.255121 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.255196 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.255252 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:34Z","lastTransitionTime":"2026-02-17T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.358005 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.358066 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.358083 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.358106 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.358125 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:34Z","lastTransitionTime":"2026-02-17T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.366265 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 04:49:42.174855643 +0000 UTC Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.461734 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.461989 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.462049 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.462117 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.462174 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:34Z","lastTransitionTime":"2026-02-17T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.564518 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.564788 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.564971 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.565036 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.565099 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:34Z","lastTransitionTime":"2026-02-17T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.667200 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.667243 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.667259 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.667276 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.667288 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:34Z","lastTransitionTime":"2026-02-17T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.769059 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.769100 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.769108 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.769128 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.769136 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:34Z","lastTransitionTime":"2026-02-17T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.829925 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rgmx_ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6/kube-multus/0.log" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.829995 4848 generic.go:334] "Generic (PLEG): container finished" podID="ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6" containerID="294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db" exitCode=1 Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.830030 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6rgmx" event={"ID":"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6","Type":"ContainerDied","Data":"294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.830470 4848 scope.go:117] "RemoveContainer" containerID="294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.842636 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.856067 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.867431 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.871031 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.871078 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.871087 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.871101 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.871110 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:34Z","lastTransitionTime":"2026-02-17T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.886366 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:17Z\\\",\\\"message\\\":\\\".361707 6524 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 09:06:17.361795 6524 services_controller.go:452] Built service openshift-etcd/etcd per-node LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361812 6524 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361828 6524 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c
0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.908138 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30709
8be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.921922 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.935836 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.945895 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.957527 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.968108 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc 
kubenswrapper[4848]: I0217 09:06:34.972988 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.973064 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.973082 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.973109 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.973126 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:34Z","lastTransitionTime":"2026-02-17T09:06:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.979226 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d3
81803152a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:34 crc kubenswrapper[4848]: I0217 09:06:34.994626 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff
909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:34Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.006508 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.017173 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.045212 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.057255 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:34Z\\\",\\\"message\\\":\\\"2026-02-17T09:05:49+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_750d0a12-3063-4d55-a4e6-56c8b6cfe6a6\\\\n2026-02-17T09:05:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_750d0a12-3063-4d55-a4e6-56c8b6cfe6a6 to /host/opt/cni/bin/\\\\n2026-02-17T09:05:49Z [verbose] multus-daemon started\\\\n2026-02-17T09:05:49Z [verbose] Readiness Indicator file check\\\\n2026-02-17T09:06:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.069795 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.075333 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.075475 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.075555 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:35 crc 
kubenswrapper[4848]: I0217 09:06:35.075648 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.075746 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:35Z","lastTransitionTime":"2026-02-17T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.177670 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.177967 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.178029 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.178098 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.178162 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:35Z","lastTransitionTime":"2026-02-17T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.279726 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.279774 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.279783 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.279796 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.279805 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:35Z","lastTransitionTime":"2026-02-17T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.367018 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 18:44:02.410749894 +0000 UTC Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.381436 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.381563 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.381641 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.381710 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.381789 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:35Z","lastTransitionTime":"2026-02-17T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.382685 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.382718 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.382700 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.382700 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:35 crc kubenswrapper[4848]: E0217 09:06:35.382829 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:35 crc kubenswrapper[4848]: E0217 09:06:35.382894 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:35 crc kubenswrapper[4848]: E0217 09:06:35.382974 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:35 crc kubenswrapper[4848]: E0217 09:06:35.383053 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.484227 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.484516 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.484605 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.484674 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.484734 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:35Z","lastTransitionTime":"2026-02-17T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.587135 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.587190 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.587201 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.587219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.587231 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:35Z","lastTransitionTime":"2026-02-17T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.689893 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.690252 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.690400 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.690535 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.690679 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:35Z","lastTransitionTime":"2026-02-17T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.793547 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.793982 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.794145 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.794285 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.794427 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:35Z","lastTransitionTime":"2026-02-17T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.836216 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rgmx_ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6/kube-multus/0.log" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.836273 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6rgmx" event={"ID":"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6","Type":"ContainerStarted","Data":"819c45bf58d7144faff1ddfb973187d2fa7bced89ad4d5d32ad750d5dc4f39ab"} Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.850449 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d381803152a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.875184 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff
909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.890803 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.897047 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.897103 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.897116 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 
09:06:35.897130 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.897141 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:35Z","lastTransitionTime":"2026-02-17T09:06:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.907040 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.920595 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.936204 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819c45bf58d7144faff1ddfb973187d2fa7bced89ad4d5d32ad750d5dc4f39ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:34Z\\\",\\\"message\\\":\\\"2026-02-17T09:05:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_750d0a12-3063-4d55-a4e6-56c8b6cfe6a6\\\\n2026-02-17T09:05:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_750d0a12-3063-4d55-a4e6-56c8b6cfe6a6 to /host/opt/cni/bin/\\\\n2026-02-17T09:05:49Z [verbose] multus-daemon started\\\\n2026-02-17T09:05:49Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T09:06:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.949550 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.966878 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.981431 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:35 crc kubenswrapper[4848]: I0217 09:06:35.994161 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:35Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.001912 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.001968 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.001978 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.001994 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.002010 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:36Z","lastTransitionTime":"2026-02-17T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.016712 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:17Z\\\",\\\"message\\\":\\\".361707 6524 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 09:06:17.361795 6524 services_controller.go:452] Built service openshift-etcd/etcd per-node LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361812 6524 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361828 6524 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c
0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:36Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.038063 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30709
8be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:36Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.052956 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:36Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.066795 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:36Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.078055 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:36Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.089319 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:36Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.102356 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:36Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:36 crc 
kubenswrapper[4848]: I0217 09:06:36.104390 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.104420 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.104429 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.104442 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.104450 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:36Z","lastTransitionTime":"2026-02-17T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.206918 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.206965 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.206976 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.206991 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.207001 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:36Z","lastTransitionTime":"2026-02-17T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.310303 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.310369 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.310384 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.310409 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.310426 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:36Z","lastTransitionTime":"2026-02-17T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.367254 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 02:31:41.414436944 +0000 UTC Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.412647 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.412704 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.412720 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.412743 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.412805 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:36Z","lastTransitionTime":"2026-02-17T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.516028 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.516076 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.516094 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.516117 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.516133 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:36Z","lastTransitionTime":"2026-02-17T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.619667 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.619745 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.619797 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.619827 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.619849 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:36Z","lastTransitionTime":"2026-02-17T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.722525 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.722574 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.722623 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.722651 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.722671 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:36Z","lastTransitionTime":"2026-02-17T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.825214 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.825281 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.825300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.825323 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.825340 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:36Z","lastTransitionTime":"2026-02-17T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.927902 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.927931 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.927938 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.927951 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:36 crc kubenswrapper[4848]: I0217 09:06:36.927959 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:36Z","lastTransitionTime":"2026-02-17T09:06:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.030616 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.030930 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.031014 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.031085 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.031182 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:37Z","lastTransitionTime":"2026-02-17T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.134102 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.134144 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.134155 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.134172 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.134186 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:37Z","lastTransitionTime":"2026-02-17T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.236518 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.236840 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.236954 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.237000 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.237015 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:37Z","lastTransitionTime":"2026-02-17T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.338693 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.338726 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.338736 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.338749 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.338780 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:37Z","lastTransitionTime":"2026-02-17T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.368298 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 08:31:56.763469595 +0000 UTC Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.382751 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:37 crc kubenswrapper[4848]: E0217 09:06:37.382893 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.382754 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.383029 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.382986 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:37 crc kubenswrapper[4848]: E0217 09:06:37.383139 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:37 crc kubenswrapper[4848]: E0217 09:06:37.383267 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:37 crc kubenswrapper[4848]: E0217 09:06:37.383309 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.442170 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.442237 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.442255 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.442280 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.442297 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:37Z","lastTransitionTime":"2026-02-17T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.545281 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.545369 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.545393 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.545419 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.545437 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:37Z","lastTransitionTime":"2026-02-17T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.648610 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.648667 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.648683 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.648706 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.648725 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:37Z","lastTransitionTime":"2026-02-17T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.752304 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.752623 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.752897 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.753262 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.753536 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:37Z","lastTransitionTime":"2026-02-17T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.856611 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.856656 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.856668 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.856683 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.856694 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:37Z","lastTransitionTime":"2026-02-17T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.960365 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.960421 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.960434 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.960452 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:37 crc kubenswrapper[4848]: I0217 09:06:37.960464 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:37Z","lastTransitionTime":"2026-02-17T09:06:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.063203 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.063592 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.063805 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.063979 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.064111 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:38Z","lastTransitionTime":"2026-02-17T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.167352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.167388 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.167400 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.167416 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.167427 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:38Z","lastTransitionTime":"2026-02-17T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.279170 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.279253 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.279343 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.279415 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.279477 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:38Z","lastTransitionTime":"2026-02-17T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.368940 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 06:02:33.9373225 +0000 UTC Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.383481 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.383684 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.383778 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.383859 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.383920 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:38Z","lastTransitionTime":"2026-02-17T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.486590 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.486978 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.487290 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.487624 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.488023 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:38Z","lastTransitionTime":"2026-02-17T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.591553 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.591612 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.591623 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.591650 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.591662 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:38Z","lastTransitionTime":"2026-02-17T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.694973 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.695337 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.695587 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.695879 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.696035 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:38Z","lastTransitionTime":"2026-02-17T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.798633 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.799174 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.799359 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.799512 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.799707 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:38Z","lastTransitionTime":"2026-02-17T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.902658 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.902713 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.902728 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.902748 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:38 crc kubenswrapper[4848]: I0217 09:06:38.902843 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:38Z","lastTransitionTime":"2026-02-17T09:06:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.006722 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.007126 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.007300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.007470 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.007672 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:39Z","lastTransitionTime":"2026-02-17T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.110465 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.110526 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.110544 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.110565 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.110581 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:39Z","lastTransitionTime":"2026-02-17T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.213583 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.213671 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.213698 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.213723 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.213741 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:39Z","lastTransitionTime":"2026-02-17T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.317540 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.318051 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.318130 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.318168 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.318192 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:39Z","lastTransitionTime":"2026-02-17T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.370189 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 17:01:32.025104911 +0000 UTC Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.382625 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:39 crc kubenswrapper[4848]: E0217 09:06:39.382922 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.383081 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.383081 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:39 crc kubenswrapper[4848]: E0217 09:06:39.383263 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.383291 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:39 crc kubenswrapper[4848]: E0217 09:06:39.383360 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:39 crc kubenswrapper[4848]: E0217 09:06:39.383466 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.421239 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.421529 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.421807 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.422017 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.422204 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:39Z","lastTransitionTime":"2026-02-17T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.526178 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.526246 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.526260 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.526286 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.526301 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:39Z","lastTransitionTime":"2026-02-17T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.629473 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.629547 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.629567 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.629596 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.629612 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:39Z","lastTransitionTime":"2026-02-17T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.733277 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.733349 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.733373 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.733403 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.733427 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:39Z","lastTransitionTime":"2026-02-17T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.836961 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.837010 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.837022 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.837043 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.837058 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:39Z","lastTransitionTime":"2026-02-17T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.940976 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.941640 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.941959 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.942181 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:39 crc kubenswrapper[4848]: I0217 09:06:39.942338 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:39Z","lastTransitionTime":"2026-02-17T09:06:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.045304 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.045364 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.045381 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.045406 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.045425 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.148142 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.148195 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.148207 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.148236 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.148250 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.252673 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.252735 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.252749 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.252796 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.252811 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.355890 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.355956 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.355970 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.355993 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.356008 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.362611 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.362697 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.362724 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.362753 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.362844 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.371323 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 06:56:13.959676229 +0000 UTC Feb 17 09:06:40 crc kubenswrapper[4848]: E0217 09:06:40.387187 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",
\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.392270 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.392308 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.392321 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.392339 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.392353 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: E0217 09:06:40.409896 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.418218 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.418293 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.418320 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.418362 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.418389 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: E0217 09:06:40.445784 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.454992 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.455048 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.455065 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.455092 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.455110 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: E0217 09:06:40.476676 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.482040 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.482112 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.482126 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.482149 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.482165 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: E0217 09:06:40.499482 4848 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"1f8980ef-de02-4b2d-9798-0aff268a6b81\\\",\\\"systemUUID\\\":\\\"8c2b9c04-4f3d-4d42-8125-29db61982ba4\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:40Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:40 crc kubenswrapper[4848]: E0217 09:06:40.499655 4848 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.501889 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.501961 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.501980 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.502001 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.502017 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.605319 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.605382 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.605396 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.605419 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.605436 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.709164 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.709229 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.709249 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.709278 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.709299 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.813153 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.813212 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.813227 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.813255 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.813272 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.916835 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.916880 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.916893 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.916910 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:40 crc kubenswrapper[4848]: I0217 09:06:40.916922 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:40Z","lastTransitionTime":"2026-02-17T09:06:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.020508 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.020574 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.020588 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.020616 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.020634 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:41Z","lastTransitionTime":"2026-02-17T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.123913 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.123964 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.123975 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.123993 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.124005 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:41Z","lastTransitionTime":"2026-02-17T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.227280 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.227321 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.227331 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.227347 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.227358 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:41Z","lastTransitionTime":"2026-02-17T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.330924 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.330981 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.330992 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.331015 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.331027 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:41Z","lastTransitionTime":"2026-02-17T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.371966 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 11:43:04.196957228 +0000 UTC Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.383588 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.383588 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.383620 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.383736 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:41 crc kubenswrapper[4848]: E0217 09:06:41.383927 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:41 crc kubenswrapper[4848]: E0217 09:06:41.384331 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:41 crc kubenswrapper[4848]: E0217 09:06:41.384380 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:41 crc kubenswrapper[4848]: E0217 09:06:41.384483 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.434496 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.434563 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.434580 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.434614 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.434635 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:41Z","lastTransitionTime":"2026-02-17T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.539129 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.539213 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.539250 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.539286 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.539309 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:41Z","lastTransitionTime":"2026-02-17T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.642879 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.642937 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.642950 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.642970 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.642989 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:41Z","lastTransitionTime":"2026-02-17T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.747049 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.747147 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.747175 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.747219 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.747252 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:41Z","lastTransitionTime":"2026-02-17T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.850754 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.850818 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.850827 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.850845 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.850858 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:41Z","lastTransitionTime":"2026-02-17T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.954180 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.954234 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.954247 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.954269 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:41 crc kubenswrapper[4848]: I0217 09:06:41.954289 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:41Z","lastTransitionTime":"2026-02-17T09:06:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.058354 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.058425 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.058443 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.058470 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.058493 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:42Z","lastTransitionTime":"2026-02-17T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.161861 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.161935 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.161947 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.161968 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.161981 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:42Z","lastTransitionTime":"2026-02-17T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.266577 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.266672 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.266698 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.266743 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.266792 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:42Z","lastTransitionTime":"2026-02-17T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.371258 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.371338 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.371369 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.371406 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.371432 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:42Z","lastTransitionTime":"2026-02-17T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.372160 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 20:55:37.228328719 +0000 UTC Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.475453 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.476073 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.476231 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.476387 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.476535 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:42Z","lastTransitionTime":"2026-02-17T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.581334 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.581395 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.581415 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.581445 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.581467 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:42Z","lastTransitionTime":"2026-02-17T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.685704 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.686113 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.686231 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.686309 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.686391 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:42Z","lastTransitionTime":"2026-02-17T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.791413 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.791898 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.791972 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.792050 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.792114 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:42Z","lastTransitionTime":"2026-02-17T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.895140 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.895200 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.895211 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.895234 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.895249 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:42Z","lastTransitionTime":"2026-02-17T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.998365 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.998404 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.998413 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.998429 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:42 crc kubenswrapper[4848]: I0217 09:06:42.998441 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:42Z","lastTransitionTime":"2026-02-17T09:06:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.102253 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.102337 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.102361 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.102390 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.102412 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:43Z","lastTransitionTime":"2026-02-17T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.205833 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.205889 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.205904 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.205930 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.205948 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:43Z","lastTransitionTime":"2026-02-17T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.309725 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.309812 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.309824 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.309847 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.309859 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:43Z","lastTransitionTime":"2026-02-17T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.372499 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 10:19:30.152730759 +0000 UTC Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.383206 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:43 crc kubenswrapper[4848]: E0217 09:06:43.383412 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.383434 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.383564 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:43 crc kubenswrapper[4848]: E0217 09:06:43.383609 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.383479 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:43 crc kubenswrapper[4848]: E0217 09:06:43.383744 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:43 crc kubenswrapper[4848]: E0217 09:06:43.383897 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.408313 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05
:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.414044 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.414107 
4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.414129 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.414165 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.414189 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:43Z","lastTransitionTime":"2026-02-17T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.427792 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.444267 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.460829 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.476140 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc 
kubenswrapper[4848]: I0217 09:06:43.490188 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d381803152a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.510275 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff
909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.516925 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.517022 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.517095 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.517138 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.517166 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:43Z","lastTransitionTime":"2026-02-17T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.526706 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.543874 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.560776 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.578850 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819c45bf58d7144faff1ddfb973187d2fa7bced89ad4d5d32ad750d5dc4f39ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:34Z\\\",\\\"message\\\":\\\"2026-02-17T09:05:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_750d0a12-3063-4d55-a4e6-56c8b6cfe6a6\\\\n2026-02-17T09:05:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_750d0a12-3063-4d55-a4e6-56c8b6cfe6a6 to /host/opt/cni/bin/\\\\n2026-02-17T09:05:49Z [verbose] multus-daemon started\\\\n2026-02-17T09:05:49Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T09:06:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.593910 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24
575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.616227 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.621802 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.621854 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.621936 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.621966 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.621984 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:43Z","lastTransitionTime":"2026-02-17T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.631061 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.641689 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.662722 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:17Z\\\",\\\"message\\\":\\\".361707 6524 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 09:06:17.361795 6524 services_controller.go:452] Built service openshift-etcd/etcd per-node LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361812 6524 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361828 6524 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c
0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.679740 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30709
8be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:43Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.725910 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.726639 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.726721 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.726819 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.726889 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:43Z","lastTransitionTime":"2026-02-17T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.830541 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.830615 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.830635 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.830662 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.830683 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:43Z","lastTransitionTime":"2026-02-17T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.934098 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.934571 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.934660 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.934793 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:43 crc kubenswrapper[4848]: I0217 09:06:43.934901 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:43Z","lastTransitionTime":"2026-02-17T09:06:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.038457 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.038522 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.038545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.038579 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.038607 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:44Z","lastTransitionTime":"2026-02-17T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.141879 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.141933 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.141951 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.141975 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.141991 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:44Z","lastTransitionTime":"2026-02-17T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.245191 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.245243 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.245260 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.245282 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.245299 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:44Z","lastTransitionTime":"2026-02-17T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.348069 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.348448 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.348616 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.348810 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.348987 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:44Z","lastTransitionTime":"2026-02-17T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.373516 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 12:26:42.534242729 +0000 UTC Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.385168 4848 scope.go:117] "RemoveContainer" containerID="f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.416744 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.452011 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.452505 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.452717 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.452958 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.453136 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:44Z","lastTransitionTime":"2026-02-17T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.556577 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.556627 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.556641 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.556659 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.556709 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:44Z","lastTransitionTime":"2026-02-17T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.659084 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.659162 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.659177 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.659191 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.659202 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:44Z","lastTransitionTime":"2026-02-17T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.763088 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.763127 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.763139 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.763156 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.763167 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:44Z","lastTransitionTime":"2026-02-17T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.867040 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.867072 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.867080 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.867095 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.867103 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:44Z","lastTransitionTime":"2026-02-17T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.869810 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/2.log" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.873479 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.884602 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:44 crc 
kubenswrapper[4848]: I0217 09:06:44.901031 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.918546 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.936462 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.951401 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.969707 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.969790 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.969812 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.969846 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.969872 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:44Z","lastTransitionTime":"2026-02-17T09:06:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.972642 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:44 crc kubenswrapper[4848]: I0217 09:06:44.987181 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819c45bf58d7144faff1ddfb973187d2fa7bced89ad4d5d32ad750d5dc4f39ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:34Z\\\",\\\"message\\\":\\\"2026-02-17T09:05:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_750d0a12-3063-4d55-a4e6-56c8b6cfe6a6\\\\n2026-02-17T09:05:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_750d0a12-3063-4d55-a4e6-56c8b6cfe6a6 to /host/opt/cni/bin/\\\\n2026-02-17T09:05:49Z [verbose] multus-daemon started\\\\n2026-02-17T09:05:49Z [verbose] 
Readiness Indicator file check\\\\n2026-02-17T09:06:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.000260 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d381803152a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:44Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.015654 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f89
45c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:
42Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b2260
2407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.027623 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.040538 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.060076 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d27d30-0b3f-4fd7-a094-fb4f32767c13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d40404ebda7101d913e0dded89797c76100db568074a844454b0bad88c8925d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ff0c548222abded2a2a1dc81cbe500edabcca08aa10f362b45f9eee2953d43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bf73bfc6b519f2acff6ae6b29373b167447e17c560d0bd087d0278cc8fa5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24770f056283550f0ae425d5bf2150923ac3a06f35b68b0a2d18ca7c28e7aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://101d3f4aaeb11bb146cc357ca24ae8f12cda88dc9088351c4c0765be56f91151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb53737be5ff4203f1364b05c1d070f2b64c59ecfc936d921baae95450acc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fb53737be5ff4203f1364b05c1d070f2b64c59ecfc936d921baae95450acc77\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3427f712b264815c5148edafb4ff7186faccbf0bccce9512d6a4df6cc57a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3427f712b264815c5148edafb4ff7186faccbf0bccce9512d6a4df6cc57a339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c9bd5a21d79a7b9825b15552676a11c4c75c8eba4654d47508452ad923587b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9bd5a21d79a7b9825b15552676a11c4c75c8eba4654d47508452ad923587b7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:26Z\\\"
,\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.072720 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.072834 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.072856 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.072906 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.072925 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:45Z","lastTransitionTime":"2026-02-17T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.077643 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac
-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.136782 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:17Z\\\",\\\"message\\\":\\\".361707 6524 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 09:06:17.361795 6524 services_controller.go:452] Built service openshift-etcd/etcd per-node LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361812 6524 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361828 6524 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.159038 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://30709
8be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.175286 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.175329 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.175348 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.175363 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.175374 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:45Z","lastTransitionTime":"2026-02-17T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.176341 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.189104 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.198551 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.278122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.278176 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.278185 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.278199 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.278209 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:45Z","lastTransitionTime":"2026-02-17T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.374580 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 19:00:57.918496783 +0000 UTC Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.381052 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.381105 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.381122 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.381148 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.381165 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:45Z","lastTransitionTime":"2026-02-17T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.383210 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:45 crc kubenswrapper[4848]: E0217 09:06:45.383368 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.383619 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:45 crc kubenswrapper[4848]: E0217 09:06:45.383717 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.383966 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:45 crc kubenswrapper[4848]: E0217 09:06:45.384063 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.384392 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:45 crc kubenswrapper[4848]: E0217 09:06:45.384496 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.483924 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.484012 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.484032 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.484058 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.484077 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:45Z","lastTransitionTime":"2026-02-17T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.587522 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.587573 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.587591 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.587618 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.587641 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:45Z","lastTransitionTime":"2026-02-17T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.690871 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.690916 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.690926 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.690945 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.690956 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:45Z","lastTransitionTime":"2026-02-17T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.797843 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.797905 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.797930 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.797964 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.797986 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:45Z","lastTransitionTime":"2026-02-17T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.882146 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/3.log" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.883526 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/2.log" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.888557 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624" exitCode=1 Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.888619 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"} Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.888686 4848 scope.go:117] "RemoveContainer" containerID="f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.890077 4848 scope.go:117] "RemoveContainer" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624" Feb 17 09:06:45 crc kubenswrapper[4848]: E0217 09:06:45.890385 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.900317 4848 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.900367 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.900380 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.900402 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.900416 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:45Z","lastTransitionTime":"2026-02-17T09:06:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.911694 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06bff994-c885-44d9-bdb7-30b93dde2005\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca3c8c541c67663ceec2edf95439371077c87b4040a0ce7656c8bf72d601cd8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eacf65ee572ba8ca73ed4cc5368d3
81803152a667577f11c4f91a7dc763256a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8ea9449101d695309f91bf3b9e7639d6451b3ef988b42658cc43411c83148857\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12f6b874bf0e3bb786d81941053c5c37b47f5cdbd2502e8da344de2de288ae5e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.935721 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"32cd1450-32ba-4eaa-be52-0a6967aa4683\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-17T09:05:42Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0217 09:05:37.067937 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0217 09:05:37.069885 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2330787832/tls.crt::/tmp/serving-cert-2330787832/tls.key\\\\\\\"\\\\nI0217 09:05:42.871008 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0217 09:05:42.876386 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0217 09:05:42.876436 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0217 09:05:42.876468 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0217 09:05:42.876480 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0217 09:05:42.887953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0217 09:05:42.887999 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0217 09:05:42.888007 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0217 09:05:42.888012 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0217 09:05:42.888016 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0217 09:05:42.888047 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0217 09:05:42.888053 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0217 09:05:42.888061 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0217 09:05:42.890682 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b22602407ac4560c9cbda1c1c0377ff
909cd36fca4e1dc9ffeca99fe8b7cde0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.952728 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.969525 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43fe5d74e3dc7917310577b5a1b7e43b54dfdb49bef428affe4af3e4ebbe13bb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:45 crc kubenswrapper[4848]: I0217 09:06:45.988483 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:45Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.002892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.002967 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.002985 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.003012 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.003092 4848 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:46Z","lastTransitionTime":"2026-02-17T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.009154 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-6rgmx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://819c45bf58d7144faff1ddfb973187d2fa7bced89ad4d5d32ad750d5dc4f39ab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://294f
1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:34Z\\\",\\\"message\\\":\\\"2026-02-17T09:05:49+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_750d0a12-3063-4d55-a4e6-56c8b6cfe6a6\\\\n2026-02-17T09:05:49+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_750d0a12-3063-4d55-a4e6-56c8b6cfe6a6 to /host/opt/cni/bin/\\\\n2026-02-17T09:05:49Z [verbose] multus-daemon started\\\\n2026-02-17T09:05:49Z [verbose] Readiness Indicator file check\\\\n2026-02-17T09:06:34Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/k
ubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-225sp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-6rgmx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.048448 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9d27d30-0b3f-4fd7-a094-fb4f32767c13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6d40404ebda7101d913e0dded89797c76100db568074a844454b0bad88c8925d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8ff0c548222abded2a2a1dc81cbe500edabcca08aa10f362b45f9eee2953d43\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df6bf73bfc6b519f2acff6ae6b29373b167447e17c560d0bd087d0278cc8fa5c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c24770f056283550f0ae425d5bf2150923ac3a06f35b68b0a2d18ca7c28e7aed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://101d3f4aaeb11bb146cc357ca24ae8f12cda88dc9088351c4c0765be56f91151\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7fb53737be5ff4203f1364b05c1d070f2b64c59ecfc936d921baae95450acc77\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fb53737be5ff4203f1364b05c1d070f2b64c59ecfc936d921baae95450acc77\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-17T09:05:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3427f712b264815c5148edafb4ff7186faccbf0bccce9512d6a4df6cc57a339\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3427f712b264815c5148edafb4ff7186faccbf0bccce9512d6a4df6cc57a339\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c9bd5a21d79a7b9825b15552676a11c4c75c8eba4654d47508452ad923587b7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c9bd5a21d79a7b9825b15552676a11c4c75c8eba4654d47508452ad923587b7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.072528 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c28fed4-873d-42f6-ae63-03d12a425d0a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://250c247ca1e4de22785ba9eba974427ad4a054a9d0776758e2634427f624b43e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4
2745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-m7nzp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-stvnz\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.090420 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://43a599c36a6dd13aafe633b6a9dcd6e1d91863812c5a0ffe58ef867027a47ddd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.106400 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.106705 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.106825 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.106947 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.107040 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:46Z","lastTransitionTime":"2026-02-17T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.111561 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64abeca6bccfb6127def298a47b03459f21e3b954e9258b2e3ce017482e6f21e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b67c0e5247aab1e211439991de6dfa41944b8fda6a8a093c70dba3a1285a1538\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.128070 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-gv2xs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5526fe6c-c04c-4ef2-a482-be066235c702\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5583b74f927fbf95b40c09df8bac1479498cf1eddc3e850c9a69d884fa8492e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-z8lb7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-gv2xs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.155549 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f68b2b34f072d2a0ec96dc2126a480b5ffbf9b47d693991a47291f26f775547d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:17Z\\\",\\\"message\\\":\\\".361707 6524 services_controller.go:451] Built service openshift-etcd/etcd cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-etcd/etcd_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-etcd/etcd\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:2379, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}, services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.253\\\\\\\", Port:9979, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0217 09:06:17.361795 6524 services_controller.go:452] Built service openshift-etcd/etcd per-node LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361812 6524 services_controller.go:453] Built service openshift-etcd/etcd template LB for network=default: []services.LB{}\\\\nI0217 09:06:17.361828 6524 services_controller.go:454] Service openshift-etcd/etcd for network=default has 2 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI0217 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:06:16Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-17T09:06:45Z\\\",\\\"message\\\":\\\" 6976 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0217 09:06:45.585258 6976 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0217 09:06:45.585424 6976 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0217 09:06:45.585537 6976 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0217 09:06:45.585588 6976 handler.go:190] 
Sending *v1.Namespace event handler 5 for removal\\\\nI0217 09:06:45.585712 6976 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0217 09:06:45.585541 6976 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0217 09:06:45.585746 6976 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0217 09:06:45.585823 6976 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0217 09:06:45.585829 6976 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0217 09:06:45.585943 6976 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0217 09:06:45.586009 6976 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0217 09:06:45.586113 6976 factory.go:656] Stopping watch factory\\\\nI0217 09:06:45.586155 6976 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0217 09:06:45.586168 6976 handler.go:208] Removed *v1.Node event handler 2\\\\nI0217 09:06:45.586198 6976 handler.go:208] Removed *v1.Node ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-17T09:06:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin
-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"n
ame\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lwttz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-4fvgf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc 
kubenswrapper[4848]: I0217 09:06:46.178247 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-t94zv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3244ef77-7b63-45b9-9b12-2b12cb6654df\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4a85b1d98b8892e543f98f9282c0894c6404171391a3943a85226186e43b66b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://52a21f2523039d358316f8ade11b4d7d4fdc3c0d20bd1ba1964377ebb388aee2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4dcd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c5f2d832a892c1c219b6c1543af4d
cd95637760be71ca7e5eb74778c8d0f8231\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0cf32f3b977a1328540891a8a232499917547933a761a6f0fb1e444cd63d343f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://307098be6ef6aa96db199103a2d5344f57b914b3f2dec88ebc4dd753c7192722\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://22f3feaf78190fec059c3547fa59e24f3583e2522f7128bbf39868f18d8550d4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831c7dfada4b5677d2b16c0ed82e4fa99675091215e04ed61e26014561f76f1e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-17T09:05:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-17T09:05:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nngkk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-t94zv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.196699 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2e50f270-f5e1-42fe-8f11-5e294ab12252\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f0ffbd53340b5452bde053ac1f6fc6f88992d5f3fd49b7a7abee8d497d360a20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3e82d1dba583f107d5beac62ef078185e5d183e5474ddde5051c8abbc85a64f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/e
tc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25bef8cb345bd7dacf1bf49935bcbed7a404533048c10bb0e231c67ceb9feb33\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:23Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.210023 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.210082 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 
09:06:46.210103 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.210130 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.210149 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:46Z","lastTransitionTime":"2026-02-17T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.216958 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:43Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.233346 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-fn2gp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb23008-bf50-4af1-812e-b8fa98dda9bf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:05:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d9e3eeb03fc0f6b952e6a8f1f8d4ed70646c979b66aeee28f79790f3bce3c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:05:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5hpm5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:05:47Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-fn2gp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.249755 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7de1a57-0b76-454f-bc44-ad6632f90e5a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e5115c6c6df4b20118349fec582725156c26e2dbc6108060f6712eff903c33\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d7
93426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9eda2256c29f83ccd246bf001fea4d97537c17b395496e4334c79203d7290123\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-17T09:06:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bs96q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:00Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-65rnv\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.264368 4848 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-78r6x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"98bfddd8-4a1a-4b90-973a-adb75b02fdba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-17T09:06:01Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq5wq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-17T09:06:01Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-78r6x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-17T09:06:46Z is after 2025-08-24T17:21:41Z" Feb 17 09:06:46 crc 
kubenswrapper[4848]: I0217 09:06:46.313872 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.314352 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.314442 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.314573 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.314670 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:46Z","lastTransitionTime":"2026-02-17T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.375821 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 01:38:30.457975941 +0000 UTC Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.418241 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.418308 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.418333 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.418360 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.418378 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:46Z","lastTransitionTime":"2026-02-17T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.522152 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.522210 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.522228 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.522257 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.522281 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:46Z","lastTransitionTime":"2026-02-17T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.625132 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.625231 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.625242 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.625266 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.625278 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:46Z","lastTransitionTime":"2026-02-17T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.728736 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.728815 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.728831 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.728855 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.728872 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:46Z","lastTransitionTime":"2026-02-17T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.832282 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.832360 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.832372 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.832392 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.832404 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:46Z","lastTransitionTime":"2026-02-17T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.895114 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/3.log" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.935364 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.935440 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.935467 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.935501 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:46 crc kubenswrapper[4848]: I0217 09:06:46.935575 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:46Z","lastTransitionTime":"2026-02-17T09:06:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.039430 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.039495 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.039518 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.039545 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.039569 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:47Z","lastTransitionTime":"2026-02-17T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.143278 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.143394 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.143421 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.143453 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.143474 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:47Z","lastTransitionTime":"2026-02-17T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.215659 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216007 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 09:07:51.215954855 +0000 UTC m=+148.759210541 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.216234 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.216392 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.216469 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.216550 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216561 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216634 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216663 4848 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216668 4848 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216728 4848 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216743 4848 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216811 4848 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216815 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.216736517 +0000 UTC m=+148.759992233 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216834 4848 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216863 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.21684082 +0000 UTC m=+148.760096596 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216907 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.216885021 +0000 UTC m=+148.760140937 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.216938 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.216922672 +0000 UTC m=+148.760178518 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.246720 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.246831 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.246860 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.246896 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.246922 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:47Z","lastTransitionTime":"2026-02-17T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.350673 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.350738 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.350794 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.350823 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.350844 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:47Z","lastTransitionTime":"2026-02-17T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.376075 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 07:15:51.817634731 +0000 UTC
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.382486 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.382800 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.382819 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.383021 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.382881 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.382501 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.383199 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba"
Feb 17 09:06:47 crc kubenswrapper[4848]: E0217 09:06:47.383345 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.454729 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.454990 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.455064 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.455103 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.455128 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:47Z","lastTransitionTime":"2026-02-17T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.558852 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.558933 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.558962 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.558999 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.559028 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:47Z","lastTransitionTime":"2026-02-17T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.662999 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.663383 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.663398 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.663420 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.663435 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:47Z","lastTransitionTime":"2026-02-17T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.766718 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.766814 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.766828 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.766852 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.766869 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:47Z","lastTransitionTime":"2026-02-17T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.870077 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.870124 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.870133 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.870173 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.870186 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:47Z","lastTransitionTime":"2026-02-17T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.973622 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.973787 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.973810 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.973879 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:47 crc kubenswrapper[4848]: I0217 09:06:47.973899 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:47Z","lastTransitionTime":"2026-02-17T09:06:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.077232 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.077300 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.077324 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.077356 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.077379 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:48Z","lastTransitionTime":"2026-02-17T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.179433 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.179486 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.179497 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.179515 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.179526 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:48Z","lastTransitionTime":"2026-02-17T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.283224 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.283290 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.283316 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.283346 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.283370 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:48Z","lastTransitionTime":"2026-02-17T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.376861 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 00:58:43.378960975 +0000 UTC
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.386740 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.386831 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.386849 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.386878 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.386898 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:48Z","lastTransitionTime":"2026-02-17T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.489833 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.489873 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.489882 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.489899 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.489912 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:48Z","lastTransitionTime":"2026-02-17T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.592857 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.592921 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.592937 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.592960 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.592978 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:48Z","lastTransitionTime":"2026-02-17T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.696639 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.696703 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.696721 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.696857 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.696878 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:48Z","lastTransitionTime":"2026-02-17T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.800828 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.800928 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.800947 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.800971 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.800989 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:48Z","lastTransitionTime":"2026-02-17T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.906385 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.906436 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.906454 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.906476 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:48 crc kubenswrapper[4848]: I0217 09:06:48.906493 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:48Z","lastTransitionTime":"2026-02-17T09:06:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.009753 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.009837 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.009853 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.009880 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.009901 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:49Z","lastTransitionTime":"2026-02-17T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.113297 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.113363 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.113386 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.113418 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.113444 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:49Z","lastTransitionTime":"2026-02-17T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.216458 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.216532 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.216555 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.216583 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.216607 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:49Z","lastTransitionTime":"2026-02-17T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.319850 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.319931 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.319950 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.319974 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.319992 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:49Z","lastTransitionTime":"2026-02-17T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.378119 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 19:17:04.993186922 +0000 UTC
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.382406 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.382493 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.382538 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.382812 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 17 09:06:49 crc kubenswrapper[4848]: E0217 09:06:49.382750 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba"
Feb 17 09:06:49 crc kubenswrapper[4848]: E0217 09:06:49.382913 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 17 09:06:49 crc kubenswrapper[4848]: E0217 09:06:49.383036 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 17 09:06:49 crc kubenswrapper[4848]: E0217 09:06:49.383320 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.422325 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.422416 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.422429 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.422448 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.422460 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:49Z","lastTransitionTime":"2026-02-17T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.525330 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.525384 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.525396 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.525414 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.525429 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:49Z","lastTransitionTime":"2026-02-17T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.628399 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.628438 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.628448 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.628464 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.628476 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:49Z","lastTransitionTime":"2026-02-17T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.731385 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.731425 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.731435 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.731450 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.731464 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:49Z","lastTransitionTime":"2026-02-17T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.835092 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.835136 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.835148 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.835167 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.835179 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:49Z","lastTransitionTime":"2026-02-17T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.939713 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.939745 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.939755 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.939784 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:49 crc kubenswrapper[4848]: I0217 09:06:49.939793 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:49Z","lastTransitionTime":"2026-02-17T09:06:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.042558 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.042602 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.042617 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.042636 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.042652 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:50Z","lastTransitionTime":"2026-02-17T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.146548 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.146630 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.146659 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.146691 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.146715 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:50Z","lastTransitionTime":"2026-02-17T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.250212 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.250253 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.250263 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.250283 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.250293 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:50Z","lastTransitionTime":"2026-02-17T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.352895 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.352933 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.352944 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.352960 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.352969 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:50Z","lastTransitionTime":"2026-02-17T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.378738 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 08:34:28.41725845 +0000 UTC Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.455946 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.455979 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.455987 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.456003 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.456012 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:50Z","lastTransitionTime":"2026-02-17T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.559159 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.559227 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.559247 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.559271 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.559290 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:50Z","lastTransitionTime":"2026-02-17T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.662481 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.662516 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.662551 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.662568 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.662580 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:50Z","lastTransitionTime":"2026-02-17T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.766738 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.766840 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.766861 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.766892 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.766912 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:50Z","lastTransitionTime":"2026-02-17T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.870376 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.870434 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.870451 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.870478 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.870496 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:50Z","lastTransitionTime":"2026-02-17T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.881391 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.881433 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.881444 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.881460 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.881477 4848 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-17T09:06:50Z","lastTransitionTime":"2026-02-17T09:06:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.939377 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt"] Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.940154 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.942802 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.942836 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.943325 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.943679 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.959795 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/713b3ae2-693a-455a-83e7-274aaf1e0a42-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.959877 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713b3ae2-693a-455a-83e7-274aaf1e0a42-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.960011 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/713b3ae2-693a-455a-83e7-274aaf1e0a42-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.960090 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/713b3ae2-693a-455a-83e7-274aaf1e0a42-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.960162 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/713b3ae2-693a-455a-83e7-274aaf1e0a42-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:50 crc kubenswrapper[4848]: I0217 09:06:50.997857 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-6rgmx" podStartSLOduration=63.997832158 podStartE2EDuration="1m3.997832158s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:50.996174283 +0000 UTC m=+88.539429939" watchObservedRunningTime="2026-02-17 09:06:50.997832158 +0000 UTC m=+88.541087814" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.015548 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=40.015523992 podStartE2EDuration="40.015523992s" 
podCreationTimestamp="2026-02-17 09:06:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:51.01544893 +0000 UTC m=+88.558704616" watchObservedRunningTime="2026-02-17 09:06:51.015523992 +0000 UTC m=+88.558779648" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.059801 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=68.059735771 podStartE2EDuration="1m8.059735771s" podCreationTimestamp="2026-02-17 09:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:51.041588154 +0000 UTC m=+88.584843840" watchObservedRunningTime="2026-02-17 09:06:51.059735771 +0000 UTC m=+88.602991457" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.061116 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/713b3ae2-693a-455a-83e7-274aaf1e0a42-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.061192 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/713b3ae2-693a-455a-83e7-274aaf1e0a42-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.061252 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/713b3ae2-693a-455a-83e7-274aaf1e0a42-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.061269 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/713b3ae2-693a-455a-83e7-274aaf1e0a42-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.061312 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/713b3ae2-693a-455a-83e7-274aaf1e0a42-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.061364 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713b3ae2-693a-455a-83e7-274aaf1e0a42-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.061549 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/713b3ae2-693a-455a-83e7-274aaf1e0a42-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 
crc kubenswrapper[4848]: I0217 09:06:51.063065 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/713b3ae2-693a-455a-83e7-274aaf1e0a42-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.073998 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/713b3ae2-693a-455a-83e7-274aaf1e0a42-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.093706 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/713b3ae2-693a-455a-83e7-274aaf1e0a42-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bq6vt\" (UID: \"713b3ae2-693a-455a-83e7-274aaf1e0a42\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.116563 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podStartSLOduration=64.116536684 podStartE2EDuration="1m4.116536684s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:51.116058311 +0000 UTC m=+88.659313997" watchObservedRunningTime="2026-02-17 09:06:51.116536684 +0000 UTC m=+88.659792370" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.117043 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" 
podStartSLOduration=7.117034807 podStartE2EDuration="7.117034807s" podCreationTimestamp="2026-02-17 09:06:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:51.102316105 +0000 UTC m=+88.645571801" watchObservedRunningTime="2026-02-17 09:06:51.117034807 +0000 UTC m=+88.660290483" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.207236 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-gv2xs" podStartSLOduration=65.207213263 podStartE2EDuration="1m5.207213263s" podCreationTimestamp="2026-02-17 09:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:51.160142866 +0000 UTC m=+88.703398532" watchObservedRunningTime="2026-02-17 09:06:51.207213263 +0000 UTC m=+88.750468919" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.238261 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-t94zv" podStartSLOduration=64.238240822 podStartE2EDuration="1m4.238240822s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:51.22319008 +0000 UTC m=+88.766445736" watchObservedRunningTime="2026-02-17 09:06:51.238240822 +0000 UTC m=+88.781496468" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.259179 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.268463 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-65rnv" podStartSLOduration=64.268443718 podStartE2EDuration="1m4.268443718s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:51.268389126 +0000 UTC m=+88.811644782" watchObservedRunningTime="2026-02-17 09:06:51.268443718 +0000 UTC m=+88.811699364" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.307040 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=61.307019532 podStartE2EDuration="1m1.307019532s" podCreationTimestamp="2026-02-17 09:05:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:51.306685683 +0000 UTC m=+88.849941319" watchObservedRunningTime="2026-02-17 09:06:51.307019532 +0000 UTC m=+88.850275178" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.334612 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-fn2gp" podStartSLOduration=64.334594317 podStartE2EDuration="1m4.334594317s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:51.333523087 +0000 UTC m=+88.876778733" watchObservedRunningTime="2026-02-17 09:06:51.334594317 +0000 UTC m=+88.877849963" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.379595 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration 
is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 16:08:26.325069938 +0000 UTC Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.379655 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.383449 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:51 crc kubenswrapper[4848]: E0217 09:06:51.383564 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.383644 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:51 crc kubenswrapper[4848]: E0217 09:06:51.383706 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.383984 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:51 crc kubenswrapper[4848]: E0217 09:06:51.384039 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.384146 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:51 crc kubenswrapper[4848]: E0217 09:06:51.384187 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.387257 4848 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.952977 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" event={"ID":"713b3ae2-693a-455a-83e7-274aaf1e0a42","Type":"ContainerStarted","Data":"a1b028b5979695edb8fcf90818737ea65f722a6e7c49505e56d963853f7199e6"} Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.953045 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" event={"ID":"713b3ae2-693a-455a-83e7-274aaf1e0a42","Type":"ContainerStarted","Data":"8ac222ff667adeb589061c40306982b694767e81f03a42937ebf5fec94d38b6e"} Feb 17 09:06:51 crc kubenswrapper[4848]: I0217 09:06:51.975917 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bq6vt" podStartSLOduration=64.975888801 podStartE2EDuration="1m4.975888801s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:06:51.974330079 +0000 UTC m=+89.517585765" watchObservedRunningTime="2026-02-17 09:06:51.975888801 +0000 UTC m=+89.519144487" Feb 17 09:06:53 crc kubenswrapper[4848]: I0217 09:06:53.382664 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:53 crc kubenswrapper[4848]: I0217 09:06:53.382720 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:53 crc kubenswrapper[4848]: I0217 09:06:53.382793 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:53 crc kubenswrapper[4848]: E0217 09:06:53.383721 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:53 crc kubenswrapper[4848]: I0217 09:06:53.383808 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:53 crc kubenswrapper[4848]: E0217 09:06:53.383887 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:53 crc kubenswrapper[4848]: E0217 09:06:53.384045 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:53 crc kubenswrapper[4848]: E0217 09:06:53.384144 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:55 crc kubenswrapper[4848]: I0217 09:06:55.383107 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:55 crc kubenswrapper[4848]: I0217 09:06:55.383189 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:55 crc kubenswrapper[4848]: I0217 09:06:55.383111 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:55 crc kubenswrapper[4848]: I0217 09:06:55.383105 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:55 crc kubenswrapper[4848]: E0217 09:06:55.383263 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:55 crc kubenswrapper[4848]: E0217 09:06:55.383418 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:55 crc kubenswrapper[4848]: E0217 09:06:55.383551 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:55 crc kubenswrapper[4848]: E0217 09:06:55.383706 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:57 crc kubenswrapper[4848]: I0217 09:06:57.383130 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:57 crc kubenswrapper[4848]: I0217 09:06:57.383191 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:57 crc kubenswrapper[4848]: I0217 09:06:57.383364 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:57 crc kubenswrapper[4848]: E0217 09:06:57.384745 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:57 crc kubenswrapper[4848]: I0217 09:06:57.384813 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:57 crc kubenswrapper[4848]: E0217 09:06:57.385001 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:06:57 crc kubenswrapper[4848]: E0217 09:06:57.385557 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:57 crc kubenswrapper[4848]: E0217 09:06:57.385558 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:57 crc kubenswrapper[4848]: I0217 09:06:57.386377 4848 scope.go:117] "RemoveContainer" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624" Feb 17 09:06:57 crc kubenswrapper[4848]: E0217 09:06:57.386709 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" Feb 17 09:06:57 crc kubenswrapper[4848]: I0217 09:06:57.405278 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 17 09:06:59 crc kubenswrapper[4848]: I0217 09:06:59.383438 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:06:59 crc kubenswrapper[4848]: I0217 09:06:59.383489 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:06:59 crc kubenswrapper[4848]: I0217 09:06:59.383547 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:06:59 crc kubenswrapper[4848]: E0217 09:06:59.383706 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:06:59 crc kubenswrapper[4848]: I0217 09:06:59.383752 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:06:59 crc kubenswrapper[4848]: E0217 09:06:59.384071 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:06:59 crc kubenswrapper[4848]: E0217 09:06:59.384147 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:06:59 crc kubenswrapper[4848]: E0217 09:06:59.384201 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:01 crc kubenswrapper[4848]: I0217 09:07:01.382909 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:01 crc kubenswrapper[4848]: E0217 09:07:01.383388 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:01 crc kubenswrapper[4848]: I0217 09:07:01.382943 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:01 crc kubenswrapper[4848]: I0217 09:07:01.383085 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:01 crc kubenswrapper[4848]: E0217 09:07:01.383594 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:01 crc kubenswrapper[4848]: I0217 09:07:01.383003 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:01 crc kubenswrapper[4848]: E0217 09:07:01.383820 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:01 crc kubenswrapper[4848]: E0217 09:07:01.383811 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:03 crc kubenswrapper[4848]: I0217 09:07:03.382336 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:03 crc kubenswrapper[4848]: I0217 09:07:03.382407 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:03 crc kubenswrapper[4848]: I0217 09:07:03.382430 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:03 crc kubenswrapper[4848]: E0217 09:07:03.384479 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:03 crc kubenswrapper[4848]: I0217 09:07:03.384505 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:03 crc kubenswrapper[4848]: E0217 09:07:03.384615 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:03 crc kubenswrapper[4848]: E0217 09:07:03.386067 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:03 crc kubenswrapper[4848]: E0217 09:07:03.386311 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:03 crc kubenswrapper[4848]: I0217 09:07:03.405055 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=6.405031859 podStartE2EDuration="6.405031859s" podCreationTimestamp="2026-02-17 09:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:03.403825276 +0000 UTC m=+100.947080952" watchObservedRunningTime="2026-02-17 09:07:03.405031859 +0000 UTC m=+100.948287535" Feb 17 09:07:04 crc kubenswrapper[4848]: I0217 09:07:04.514487 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:07:04 crc kubenswrapper[4848]: I0217 09:07:04.515790 4848 scope.go:117] "RemoveContainer" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624" Feb 17 09:07:04 crc kubenswrapper[4848]: E0217 09:07:04.516044 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" 
podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" Feb 17 09:07:05 crc kubenswrapper[4848]: I0217 09:07:05.382754 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:05 crc kubenswrapper[4848]: I0217 09:07:05.382864 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:05 crc kubenswrapper[4848]: I0217 09:07:05.382746 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:05 crc kubenswrapper[4848]: E0217 09:07:05.383040 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:05 crc kubenswrapper[4848]: E0217 09:07:05.383304 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:05 crc kubenswrapper[4848]: E0217 09:07:05.383214 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:05 crc kubenswrapper[4848]: I0217 09:07:05.384098 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:05 crc kubenswrapper[4848]: E0217 09:07:05.384371 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:06 crc kubenswrapper[4848]: I0217 09:07:06.028746 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:06 crc kubenswrapper[4848]: E0217 09:07:06.029522 4848 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:07:06 crc kubenswrapper[4848]: E0217 09:07:06.029664 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs podName:98bfddd8-4a1a-4b90-973a-adb75b02fdba nodeName:}" failed. No retries permitted until 2026-02-17 09:08:10.029644207 +0000 UTC m=+167.572899853 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs") pod "network-metrics-daemon-78r6x" (UID: "98bfddd8-4a1a-4b90-973a-adb75b02fdba") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 17 09:07:07 crc kubenswrapper[4848]: I0217 09:07:07.382872 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:07 crc kubenswrapper[4848]: E0217 09:07:07.382991 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:07 crc kubenswrapper[4848]: I0217 09:07:07.382887 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:07 crc kubenswrapper[4848]: I0217 09:07:07.383075 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:07 crc kubenswrapper[4848]: E0217 09:07:07.383204 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:07 crc kubenswrapper[4848]: E0217 09:07:07.383099 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:07 crc kubenswrapper[4848]: I0217 09:07:07.383620 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:07 crc kubenswrapper[4848]: E0217 09:07:07.383818 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:09 crc kubenswrapper[4848]: I0217 09:07:09.382658 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:09 crc kubenswrapper[4848]: I0217 09:07:09.382721 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:09 crc kubenswrapper[4848]: I0217 09:07:09.382918 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:09 crc kubenswrapper[4848]: E0217 09:07:09.383124 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:09 crc kubenswrapper[4848]: I0217 09:07:09.383476 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:09 crc kubenswrapper[4848]: E0217 09:07:09.383605 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:09 crc kubenswrapper[4848]: E0217 09:07:09.383918 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:09 crc kubenswrapper[4848]: E0217 09:07:09.384142 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:11 crc kubenswrapper[4848]: I0217 09:07:11.385296 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:11 crc kubenswrapper[4848]: I0217 09:07:11.385348 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:11 crc kubenswrapper[4848]: I0217 09:07:11.385305 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:11 crc kubenswrapper[4848]: E0217 09:07:11.385487 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:11 crc kubenswrapper[4848]: I0217 09:07:11.385574 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:11 crc kubenswrapper[4848]: E0217 09:07:11.385626 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:11 crc kubenswrapper[4848]: E0217 09:07:11.385712 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:11 crc kubenswrapper[4848]: E0217 09:07:11.385846 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:13 crc kubenswrapper[4848]: I0217 09:07:13.383114 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:13 crc kubenswrapper[4848]: I0217 09:07:13.383182 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:13 crc kubenswrapper[4848]: I0217 09:07:13.383931 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:13 crc kubenswrapper[4848]: E0217 09:07:13.385139 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:13 crc kubenswrapper[4848]: I0217 09:07:13.385198 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:13 crc kubenswrapper[4848]: E0217 09:07:13.385312 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:13 crc kubenswrapper[4848]: E0217 09:07:13.385523 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:13 crc kubenswrapper[4848]: E0217 09:07:13.385631 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:15 crc kubenswrapper[4848]: I0217 09:07:15.383425 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:15 crc kubenswrapper[4848]: I0217 09:07:15.383490 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:15 crc kubenswrapper[4848]: I0217 09:07:15.383454 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:15 crc kubenswrapper[4848]: I0217 09:07:15.383633 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:15 crc kubenswrapper[4848]: E0217 09:07:15.383739 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:15 crc kubenswrapper[4848]: E0217 09:07:15.385029 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:15 crc kubenswrapper[4848]: E0217 09:07:15.385229 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:15 crc kubenswrapper[4848]: E0217 09:07:15.385501 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:15 crc kubenswrapper[4848]: I0217 09:07:15.385647 4848 scope.go:117] "RemoveContainer" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624" Feb 17 09:07:15 crc kubenswrapper[4848]: E0217 09:07:15.385979 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-4fvgf_openshift-ovn-kubernetes(2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8)\"" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" Feb 17 09:07:17 crc kubenswrapper[4848]: I0217 09:07:17.383164 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:17 crc kubenswrapper[4848]: I0217 09:07:17.384117 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:17 crc kubenswrapper[4848]: I0217 09:07:17.384186 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:17 crc kubenswrapper[4848]: E0217 09:07:17.384342 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:17 crc kubenswrapper[4848]: I0217 09:07:17.384414 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:17 crc kubenswrapper[4848]: E0217 09:07:17.384493 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:17 crc kubenswrapper[4848]: E0217 09:07:17.384616 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:17 crc kubenswrapper[4848]: E0217 09:07:17.384850 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:19 crc kubenswrapper[4848]: I0217 09:07:19.387509 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:19 crc kubenswrapper[4848]: I0217 09:07:19.387690 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:19 crc kubenswrapper[4848]: E0217 09:07:19.387687 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:19 crc kubenswrapper[4848]: I0217 09:07:19.387737 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:19 crc kubenswrapper[4848]: I0217 09:07:19.387799 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:19 crc kubenswrapper[4848]: E0217 09:07:19.387906 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:19 crc kubenswrapper[4848]: E0217 09:07:19.388001 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:19 crc kubenswrapper[4848]: E0217 09:07:19.388083 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:21 crc kubenswrapper[4848]: I0217 09:07:21.058902 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rgmx_ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6/kube-multus/1.log" Feb 17 09:07:21 crc kubenswrapper[4848]: I0217 09:07:21.059532 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rgmx_ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6/kube-multus/0.log" Feb 17 09:07:21 crc kubenswrapper[4848]: I0217 09:07:21.059617 4848 generic.go:334] "Generic (PLEG): container finished" podID="ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6" containerID="819c45bf58d7144faff1ddfb973187d2fa7bced89ad4d5d32ad750d5dc4f39ab" exitCode=1 Feb 17 09:07:21 crc kubenswrapper[4848]: I0217 09:07:21.059679 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6rgmx" event={"ID":"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6","Type":"ContainerDied","Data":"819c45bf58d7144faff1ddfb973187d2fa7bced89ad4d5d32ad750d5dc4f39ab"} Feb 17 09:07:21 crc kubenswrapper[4848]: I0217 09:07:21.059789 4848 scope.go:117] "RemoveContainer" containerID="294f1e4dfee18fe3e2ed81b5638e50ffc99eba99b040a42f338025243f0980db" Feb 17 09:07:21 crc kubenswrapper[4848]: I0217 09:07:21.060706 4848 scope.go:117] "RemoveContainer" containerID="819c45bf58d7144faff1ddfb973187d2fa7bced89ad4d5d32ad750d5dc4f39ab" Feb 17 09:07:21 crc kubenswrapper[4848]: E0217 09:07:21.061054 
4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-6rgmx_openshift-multus(ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6)\"" pod="openshift-multus/multus-6rgmx" podUID="ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6" Feb 17 09:07:21 crc kubenswrapper[4848]: I0217 09:07:21.382753 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:21 crc kubenswrapper[4848]: I0217 09:07:21.382798 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:21 crc kubenswrapper[4848]: I0217 09:07:21.382875 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:21 crc kubenswrapper[4848]: I0217 09:07:21.382816 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:21 crc kubenswrapper[4848]: E0217 09:07:21.383337 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:21 crc kubenswrapper[4848]: E0217 09:07:21.383451 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:21 crc kubenswrapper[4848]: E0217 09:07:21.383561 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:21 crc kubenswrapper[4848]: E0217 09:07:21.383711 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:22 crc kubenswrapper[4848]: I0217 09:07:22.066318 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rgmx_ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6/kube-multus/1.log" Feb 17 09:07:23 crc kubenswrapper[4848]: E0217 09:07:23.379186 4848 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 17 09:07:23 crc kubenswrapper[4848]: I0217 09:07:23.382833 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:23 crc kubenswrapper[4848]: I0217 09:07:23.382911 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:23 crc kubenswrapper[4848]: I0217 09:07:23.382933 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:23 crc kubenswrapper[4848]: E0217 09:07:23.384031 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:23 crc kubenswrapper[4848]: I0217 09:07:23.384064 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:23 crc kubenswrapper[4848]: E0217 09:07:23.384224 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:23 crc kubenswrapper[4848]: E0217 09:07:23.384388 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:23 crc kubenswrapper[4848]: E0217 09:07:23.384485 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:23 crc kubenswrapper[4848]: E0217 09:07:23.464343 4848 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 09:07:25 crc kubenswrapper[4848]: I0217 09:07:25.383194 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:25 crc kubenswrapper[4848]: I0217 09:07:25.383249 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:25 crc kubenswrapper[4848]: I0217 09:07:25.383194 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:25 crc kubenswrapper[4848]: I0217 09:07:25.383316 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:25 crc kubenswrapper[4848]: E0217 09:07:25.383439 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:25 crc kubenswrapper[4848]: E0217 09:07:25.383520 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:25 crc kubenswrapper[4848]: E0217 09:07:25.383661 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:25 crc kubenswrapper[4848]: E0217 09:07:25.383815 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:26 crc kubenswrapper[4848]: I0217 09:07:26.384025 4848 scope.go:117] "RemoveContainer" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624" Feb 17 09:07:27 crc kubenswrapper[4848]: I0217 09:07:27.086225 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/3.log" Feb 17 09:07:27 crc kubenswrapper[4848]: I0217 09:07:27.088680 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerStarted","Data":"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"} Feb 17 09:07:27 crc kubenswrapper[4848]: I0217 09:07:27.089401 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:07:27 crc kubenswrapper[4848]: I0217 09:07:27.131285 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podStartSLOduration=100.131261375 podStartE2EDuration="1m40.131261375s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:27.130554596 +0000 UTC m=+124.673810272" watchObservedRunningTime="2026-02-17 09:07:27.131261375 +0000 UTC m=+124.674517051" Feb 17 09:07:27 crc kubenswrapper[4848]: I0217 09:07:27.398467 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:27 crc kubenswrapper[4848]: E0217 09:07:27.398695 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:27 crc kubenswrapper[4848]: I0217 09:07:27.398821 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:27 crc kubenswrapper[4848]: I0217 09:07:27.398857 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:27 crc kubenswrapper[4848]: I0217 09:07:27.398881 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:27 crc kubenswrapper[4848]: E0217 09:07:27.398985 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:27 crc kubenswrapper[4848]: E0217 09:07:27.399104 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:27 crc kubenswrapper[4848]: E0217 09:07:27.399265 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:27 crc kubenswrapper[4848]: I0217 09:07:27.406411 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-78r6x"] Feb 17 09:07:28 crc kubenswrapper[4848]: I0217 09:07:28.091840 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:28 crc kubenswrapper[4848]: E0217 09:07:28.092009 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:28 crc kubenswrapper[4848]: E0217 09:07:28.466214 4848 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 09:07:29 crc kubenswrapper[4848]: I0217 09:07:29.383420 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:29 crc kubenswrapper[4848]: I0217 09:07:29.383534 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:29 crc kubenswrapper[4848]: E0217 09:07:29.384055 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:29 crc kubenswrapper[4848]: I0217 09:07:29.383536 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:29 crc kubenswrapper[4848]: E0217 09:07:29.384205 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:29 crc kubenswrapper[4848]: E0217 09:07:29.384481 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:30 crc kubenswrapper[4848]: I0217 09:07:30.383302 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:30 crc kubenswrapper[4848]: E0217 09:07:30.383442 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:31 crc kubenswrapper[4848]: I0217 09:07:31.383258 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:31 crc kubenswrapper[4848]: I0217 09:07:31.383297 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:31 crc kubenswrapper[4848]: I0217 09:07:31.383354 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:31 crc kubenswrapper[4848]: E0217 09:07:31.383454 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:31 crc kubenswrapper[4848]: E0217 09:07:31.383579 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:31 crc kubenswrapper[4848]: E0217 09:07:31.383693 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:32 crc kubenswrapper[4848]: I0217 09:07:32.382935 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:32 crc kubenswrapper[4848]: E0217 09:07:32.383122 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:33 crc kubenswrapper[4848]: I0217 09:07:33.382847 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:33 crc kubenswrapper[4848]: I0217 09:07:33.382904 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:33 crc kubenswrapper[4848]: I0217 09:07:33.382939 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:33 crc kubenswrapper[4848]: E0217 09:07:33.383984 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:33 crc kubenswrapper[4848]: E0217 09:07:33.384155 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:33 crc kubenswrapper[4848]: E0217 09:07:33.384211 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:33 crc kubenswrapper[4848]: E0217 09:07:33.467114 4848 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 17 09:07:34 crc kubenswrapper[4848]: I0217 09:07:34.382970 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:34 crc kubenswrapper[4848]: E0217 09:07:34.383279 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:34 crc kubenswrapper[4848]: I0217 09:07:34.529899 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:07:35 crc kubenswrapper[4848]: I0217 09:07:35.382734 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:35 crc kubenswrapper[4848]: I0217 09:07:35.382793 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:35 crc kubenswrapper[4848]: I0217 09:07:35.382866 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:35 crc kubenswrapper[4848]: E0217 09:07:35.382871 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:35 crc kubenswrapper[4848]: E0217 09:07:35.382987 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:35 crc kubenswrapper[4848]: E0217 09:07:35.383055 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:36 crc kubenswrapper[4848]: I0217 09:07:36.383064 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:36 crc kubenswrapper[4848]: E0217 09:07:36.383281 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:36 crc kubenswrapper[4848]: I0217 09:07:36.383750 4848 scope.go:117] "RemoveContainer" containerID="819c45bf58d7144faff1ddfb973187d2fa7bced89ad4d5d32ad750d5dc4f39ab" Feb 17 09:07:37 crc kubenswrapper[4848]: I0217 09:07:37.124817 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rgmx_ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6/kube-multus/1.log" Feb 17 09:07:37 crc kubenswrapper[4848]: I0217 09:07:37.125105 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6rgmx" event={"ID":"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6","Type":"ContainerStarted","Data":"c2340b994635fb61af0accf68fe141fcf64aaa0467d09e7a83fdf9cbd2ea65ed"} Feb 17 09:07:37 crc kubenswrapper[4848]: I0217 09:07:37.382753 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:37 crc kubenswrapper[4848]: I0217 09:07:37.382848 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:37 crc kubenswrapper[4848]: E0217 09:07:37.382924 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 17 09:07:37 crc kubenswrapper[4848]: I0217 09:07:37.382752 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:37 crc kubenswrapper[4848]: E0217 09:07:37.383028 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 17 09:07:37 crc kubenswrapper[4848]: E0217 09:07:37.383172 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 17 09:07:38 crc kubenswrapper[4848]: I0217 09:07:38.383222 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:38 crc kubenswrapper[4848]: E0217 09:07:38.383404 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-78r6x" podUID="98bfddd8-4a1a-4b90-973a-adb75b02fdba" Feb 17 09:07:39 crc kubenswrapper[4848]: I0217 09:07:39.382942 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:39 crc kubenswrapper[4848]: I0217 09:07:39.382996 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:39 crc kubenswrapper[4848]: I0217 09:07:39.383235 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:39 crc kubenswrapper[4848]: I0217 09:07:39.385664 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 17 09:07:39 crc kubenswrapper[4848]: I0217 09:07:39.385911 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 17 09:07:39 crc kubenswrapper[4848]: I0217 09:07:39.387886 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 09:07:39 crc kubenswrapper[4848]: I0217 09:07:39.388986 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 17 09:07:40 crc kubenswrapper[4848]: I0217 09:07:40.382855 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:07:40 crc kubenswrapper[4848]: I0217 09:07:40.385712 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 17 09:07:40 crc kubenswrapper[4848]: I0217 09:07:40.387484 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.180394 4848 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.228640 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.229305 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.236652 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.238208 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.238251 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.238442 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.241671 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 17 09:07:42 crc 
kubenswrapper[4848]: I0217 09:07:42.242508 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.260874 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fqhrm"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.261588 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.261913 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.261933 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.267667 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b8gqs"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.270189 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9zhhp"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.270574 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.270630 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.273134 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.273412 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.273531 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.273610 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.273773 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.273850 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.273947 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.273946 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.274074 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.274198 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.274380 4848 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.276159 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xzdww"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.276751 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.277984 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.278241 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.278909 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.279487 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.280518 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.280902 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.281025 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.281088 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.281253 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.281375 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.283986 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.284474 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-9h2hf"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.284749 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-48d4r"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.285125 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.285818 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.286083 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.287199 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.287459 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.287587 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.287625 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.287680 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.287815 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.287875 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.287935 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 09:07:42 crc 
kubenswrapper[4848]: I0217 09:07:42.289921 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.290224 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.290399 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.290665 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.291144 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.291354 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.291894 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292134 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292192 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292267 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292367 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292383 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292463 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292489 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292564 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292579 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292633 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292666 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292703 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292782 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292792 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292859 4848 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"etcd-client" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292864 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292926 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.292991 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.293536 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.294700 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.295163 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.295317 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.296493 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.299302 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.301537 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6xbj4"] Feb 17 
09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.301824 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-7vmx8"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.301877 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.302111 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.302199 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.302382 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.311258 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.311462 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7vmx8" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.312088 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.312798 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.315742 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.318009 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.319133 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.323209 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.323355 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.323370 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-gxh7z"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.323469 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.323664 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.323821 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.323977 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.324049 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Feb 17 09:07:42 crc 
kubenswrapper[4848]: I0217 09:07:42.324160 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.324692 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.324910 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.324941 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.325532 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.325665 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.325796 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.325985 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.325524 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.333928 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.343743 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.344162 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.344749 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.345273 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.347730 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.348163 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.348469 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.348642 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.350072 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.354127 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.355359 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.356019 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.356857 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.357514 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.357706 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.360247 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.360562 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.360806 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.361137 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c7jxs"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.361522 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.361570 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.361139 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.361917 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.362085 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.368175 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hfc2"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.368390 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.368695 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.368941 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.368986 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.369120 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.369179 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.369124 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.369370 4848 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.369385 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.369447 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.369459 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.371497 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6rbxt"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.372029 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.372920 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-images\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.372988 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373022 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97g5\" (UniqueName: \"kubernetes.io/projected/575767dd-6121-4745-aae9-c5434aee72d5-kube-api-access-t97g5\") pod \"downloads-7954f5f757-7vmx8\" (UID: \"575767dd-6121-4745-aae9-c5434aee72d5\") " pod="openshift-console/downloads-7954f5f757-7vmx8" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373077 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/038c87fd-172d-4e34-8927-92bafb47879a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7b6g5\" (UID: \"038c87fd-172d-4e34-8927-92bafb47879a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373100 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/414280b4-3299-4fee-a33c-a231b66000c7-service-ca-bundle\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373121 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71638eb4-fb1c-42be-84d6-d900ad27f196-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373143 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtt5h\" (UniqueName: \"kubernetes.io/projected/4baeac0b-e75b-49b5-9206-4cd99a4764f6-kube-api-access-jtt5h\") pod 
\"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373165 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4134a970-5107-41d2-8fbe-336387a17b77-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373189 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rkfw\" (UniqueName: \"kubernetes.io/projected/b1482be7-a50a-43ca-974a-d49cad628e46-kube-api-access-9rkfw\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373215 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373239 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc 
kubenswrapper[4848]: I0217 09:07:42.373260 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4134a970-5107-41d2-8fbe-336387a17b77-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373287 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/414280b4-3299-4fee-a33c-a231b66000c7-default-certificate\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373308 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c546b6c9-bd0c-4a50-9521-49af89f3ede1-serving-cert\") pod \"openshift-config-operator-7777fb866f-zhzdb\" (UID: \"c546b6c9-bd0c-4a50-9521-49af89f3ede1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373328 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373348 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373373 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpt56\" (UniqueName: \"kubernetes.io/projected/a32bd13d-f885-4653-8868-89fc4a8ac111-kube-api-access-qpt56\") pod \"cluster-samples-operator-665b6dd947-ph5wk\" (UID: \"a32bd13d-f885-4653-8868-89fc4a8ac111\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373398 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4cw7d\" (UID: \"f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373431 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdxj\" (UniqueName: \"kubernetes.io/projected/79fb293c-379a-47ac-a9b1-23746cb0758e-kube-api-access-kvdxj\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373454 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1482be7-a50a-43ca-974a-d49cad628e46-node-pullsecrets\") pod \"apiserver-76f77b778f-b8gqs\" (UID: 
\"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373477 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/414280b4-3299-4fee-a33c-a231b66000c7-stats-auth\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373500 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373522 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/981371a6-bce6-4fc3-a2ea-2f4c0e26072c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xtkxw\" (UID: \"981371a6-bce6-4fc3-a2ea-2f4c0e26072c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373545 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71638eb4-fb1c-42be-84d6-d900ad27f196-config\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373571 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-client-ca\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373632 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-config\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373909 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q77d\" (UniqueName: \"kubernetes.io/projected/15500d79-d7c7-4a0c-965d-4783f4a85b2c-kube-api-access-7q77d\") pod \"migrator-59844c95c7-wks4t\" (UID: \"15500d79-d7c7-4a0c-965d-4783f4a85b2c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.373962 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374012 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79fb293c-379a-47ac-a9b1-23746cb0758e-audit-dir\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 
17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374049 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374076 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwwpg\" (UniqueName: \"kubernetes.io/projected/4134a970-5107-41d2-8fbe-336387a17b77-kube-api-access-zwwpg\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374101 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/981371a6-bce6-4fc3-a2ea-2f4c0e26072c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xtkxw\" (UID: \"981371a6-bce6-4fc3-a2ea-2f4c0e26072c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374274 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65cgg\" (UniqueName: \"kubernetes.io/projected/414280b4-3299-4fee-a33c-a231b66000c7-kube-api-access-65cgg\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374301 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71638eb4-fb1c-42be-84d6-d900ad27f196-serving-cert\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374324 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-serving-cert\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374342 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jlj2\" (UniqueName: \"kubernetes.io/projected/981371a6-bce6-4fc3-a2ea-2f4c0e26072c-kube-api-access-6jlj2\") pod \"openshift-apiserver-operator-796bbdcf4f-xtkxw\" (UID: \"981371a6-bce6-4fc3-a2ea-2f4c0e26072c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374405 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/79fb293c-379a-47ac-a9b1-23746cb0758e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374485 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcfrt\" (UniqueName: \"kubernetes.io/projected/6b37aedc-f307-4f09-899e-db5b01f89c92-kube-api-access-lcfrt\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374526 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-audit\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374564 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79fb293c-379a-47ac-a9b1-23746cb0758e-etcd-client\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374588 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-config\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374605 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tsjg\" (UniqueName: \"kubernetes.io/projected/1d8cdbb3-b672-4984-8d03-562965a7b081-kube-api-access-5tsjg\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374652 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c546b6c9-bd0c-4a50-9521-49af89f3ede1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zhzdb\" (UID: \"c546b6c9-bd0c-4a50-9521-49af89f3ede1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 
17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374673 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4134a970-5107-41d2-8fbe-336387a17b77-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374688 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-console-config\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374710 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374726 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79fb293c-379a-47ac-a9b1-23746cb0758e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374749 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/b1482be7-a50a-43ca-974a-d49cad628e46-audit-dir\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374778 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f794184-a546-44a9-ab3b-47d69b306384-srv-cert\") pod \"catalog-operator-68c6474976-t68jc\" (UID: \"7f794184-a546-44a9-ab3b-47d69b306384\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374793 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f43c1a5-4813-44e6-b2b9-53b134283a59-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vggbp\" (UID: \"9f43c1a5-4813-44e6-b2b9-53b134283a59\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374827 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/038c87fd-172d-4e34-8927-92bafb47879a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7b6g5\" (UID: \"038c87fd-172d-4e34-8927-92bafb47879a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374843 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vscww\" (UniqueName: \"kubernetes.io/projected/f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95-kube-api-access-vscww\") pod \"control-plane-machine-set-operator-78cbb6b69f-4cw7d\" (UID: \"f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374886 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-config\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374904 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374936 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/79fb293c-379a-47ac-a9b1-23746cb0758e-audit-policies\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.374964 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b37aedc-f307-4f09-899e-db5b01f89c92-auth-proxy-config\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375004 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375032 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79fb293c-379a-47ac-a9b1-23746cb0758e-serving-cert\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375073 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-service-ca\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375090 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b1482be7-a50a-43ca-974a-d49cad628e46-encryption-config\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375104 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-policies\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: 
I0217 09:07:42.375159 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32bd13d-f885-4653-8868-89fc4a8ac111-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ph5wk\" (UID: \"a32bd13d-f885-4653-8868-89fc4a8ac111\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375179 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b37aedc-f307-4f09-899e-db5b01f89c92-config\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375199 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375249 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-serving-cert\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375332 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4baeac0b-e75b-49b5-9206-4cd99a4764f6-config\") pod \"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375355 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6b37aedc-f307-4f09-899e-db5b01f89c92-machine-approver-tls\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375395 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375425 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4baeac0b-e75b-49b5-9206-4cd99a4764f6-serving-cert\") pod \"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375477 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71638eb4-fb1c-42be-84d6-d900ad27f196-service-ca-bundle\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375500 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4baeac0b-e75b-49b5-9206-4cd99a4764f6-trusted-ca\") pod \"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375515 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/79fb293c-379a-47ac-a9b1-23746cb0758e-encryption-config\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375532 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f43c1a5-4813-44e6-b2b9-53b134283a59-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vggbp\" (UID: \"9f43c1a5-4813-44e6-b2b9-53b134283a59\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375547 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7lwn\" (UniqueName: \"kubernetes.io/projected/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-kube-api-access-s7lwn\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375576 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-client-ca\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375593 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375609 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7f794184-a546-44a9-ab3b-47d69b306384-profile-collector-cert\") pod \"catalog-operator-68c6474976-t68jc\" (UID: \"7f794184-a546-44a9-ab3b-47d69b306384\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375641 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrpcf\" (UniqueName: \"kubernetes.io/projected/038c87fd-172d-4e34-8927-92bafb47879a-kube-api-access-vrpcf\") pod \"openshift-controller-manager-operator-756b6f6bc6-7b6g5\" (UID: \"038c87fd-172d-4e34-8927-92bafb47879a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375661 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw6st\" (UniqueName: 
\"kubernetes.io/projected/c546b6c9-bd0c-4a50-9521-49af89f3ede1-kube-api-access-sw6st\") pod \"openshift-config-operator-7777fb866f-zhzdb\" (UID: \"c546b6c9-bd0c-4a50-9521-49af89f3ede1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375675 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-etcd-serving-ca\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375689 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1482be7-a50a-43ca-974a-d49cad628e46-serving-cert\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375825 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqns5\" (UniqueName: \"kubernetes.io/projected/71638eb4-fb1c-42be-84d6-d900ad27f196-kube-api-access-kqns5\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375850 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414280b4-3299-4fee-a33c-a231b66000c7-metrics-certs\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc 
kubenswrapper[4848]: I0217 09:07:42.375867 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-image-import-ca\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375881 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-dir\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375909 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-config\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375911 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.375993 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-oauth-config\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376025 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcbb8\" (UniqueName: \"kubernetes.io/projected/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-kube-api-access-vcbb8\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376041 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b1482be7-a50a-43ca-974a-d49cad628e46-etcd-client\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376063 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f43c1a5-4813-44e6-b2b9-53b134283a59-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vggbp\" (UID: \"9f43c1a5-4813-44e6-b2b9-53b134283a59\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376079 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htkx7\" (UniqueName: \"kubernetes.io/projected/7f794184-a546-44a9-ab3b-47d69b306384-kube-api-access-htkx7\") pod \"catalog-operator-68c6474976-t68jc\" (UID: \"7f794184-a546-44a9-ab3b-47d69b306384\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376115 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fba000-75c8-49be-945e-fc41fabf370c-serving-cert\") pod 
\"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376134 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgbwd\" (UniqueName: \"kubernetes.io/projected/b0fba000-75c8-49be-945e-fc41fabf370c-kube-api-access-fgbwd\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376150 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-trusted-ca-bundle\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376168 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376212 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-oauth-serving-cert\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376233 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zzg7\" (UniqueName: \"kubernetes.io/projected/a9b13597-8879-40d4-965b-580222915295-kube-api-access-7zzg7\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.376973 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.377008 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.377296 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-q62cm"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.380579 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.382097 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.393988 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mq695"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.394703 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.397238 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.400252 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.404493 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.409674 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.410319 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.410672 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.410880 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.411157 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.411850 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.412739 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.413320 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.414720 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-86pgj"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.415503 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.416894 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fqhrm"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.417868 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.419646 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.428307 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.428700 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.428721 
4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.428817 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.428895 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.428928 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7gvvt"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.431254 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.434483 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9zhhp"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.434519 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-jsjk5"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.434562 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.434914 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b8gqs"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.434940 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t8l8g"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.435101 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.435389 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.435412 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.435422 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xzdww"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.435501 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.435924 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.437025 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.439450 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.440441 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.441481 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.442657 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/downloads-7954f5f757-7vmx8"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.443455 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.444477 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9h2hf"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.445497 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.447034 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-q62cm"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.448188 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6xbj4"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.449814 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mq695"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.450430 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.450883 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.451821 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.452883 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-48d4r"] Feb 17 09:07:42 crc 
kubenswrapper[4848]: I0217 09:07:42.453967 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.455053 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.456164 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c7jxs"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.457233 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6rbxt"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.458199 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.459585 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.460272 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.461591 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hfc2"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.462544 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xc9l6"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.464985 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-blj8p"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.465946 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-blj8p" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.466107 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.467212 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-86pgj"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.472939 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.474454 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.475957 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477006 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpt56\" (UniqueName: \"kubernetes.io/projected/a32bd13d-f885-4653-8868-89fc4a8ac111-kube-api-access-qpt56\") pod \"cluster-samples-operator-665b6dd947-ph5wk\" (UID: \"a32bd13d-f885-4653-8868-89fc4a8ac111\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477055 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4cw7d\" (UID: \"f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477076 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1482be7-a50a-43ca-974a-d49cad628e46-node-pullsecrets\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477117 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcxwv\" (UniqueName: \"kubernetes.io/projected/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-kube-api-access-gcxwv\") pod \"collect-profiles-29521980-llz2m\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477136 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87655950-5426-4cb0-a10b-f71b0fbb0549-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477154 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/414280b4-3299-4fee-a33c-a231b66000c7-stats-auth\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477187 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-key\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477207 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71638eb4-fb1c-42be-84d6-d900ad27f196-config\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477243 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-serving-cert\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477279 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477299 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79fb293c-379a-47ac-a9b1-23746cb0758e-audit-dir\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477317 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/73e08276-9d2b-4ff7-a484-f19bfc28a9ec-metrics-tls\") pod \"dns-operator-744455d44c-c7jxs\" (UID: \"73e08276-9d2b-4ff7-a484-f19bfc28a9ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477356 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/79fb293c-379a-47ac-a9b1-23746cb0758e-audit-dir\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477381 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-etcd-ca\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477397 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-config-volume\") pod \"collect-profiles-29521980-llz2m\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477445 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d51261-dd68-44ef-9329-fdb8e325d504-config\") pod \"kube-controller-manager-operator-78b949d7b-47hsw\" (UID: \"a3d51261-dd68-44ef-9329-fdb8e325d504\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477463 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477493 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71638eb4-fb1c-42be-84d6-d900ad27f196-serving-cert\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477511 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-serving-cert\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477529 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jlj2\" (UniqueName: \"kubernetes.io/projected/981371a6-bce6-4fc3-a2ea-2f4c0e26072c-kube-api-access-6jlj2\") pod \"openshift-apiserver-operator-796bbdcf4f-xtkxw\" (UID: \"981371a6-bce6-4fc3-a2ea-2f4c0e26072c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477573 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/79fb293c-379a-47ac-a9b1-23746cb0758e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477591 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3d51261-dd68-44ef-9329-fdb8e325d504-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-47hsw\" (UID: \"a3d51261-dd68-44ef-9329-fdb8e325d504\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477611 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c546b6c9-bd0c-4a50-9521-49af89f3ede1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zhzdb\" (UID: \"c546b6c9-bd0c-4a50-9521-49af89f3ede1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477658 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-console-config\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477675 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tsjg\" (UniqueName: \"kubernetes.io/projected/1d8cdbb3-b672-4984-8d03-562965a7b081-kube-api-access-5tsjg\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477691 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1482be7-a50a-43ca-974a-d49cad628e46-audit-dir\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477707 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f43c1a5-4813-44e6-b2b9-53b134283a59-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vggbp\" (UID: \"9f43c1a5-4813-44e6-b2b9-53b134283a59\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477731 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-config\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477747 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477832 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b37aedc-f307-4f09-899e-db5b01f89c92-auth-proxy-config\") pod 
\"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477848 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477867 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b26097da-27bf-41ed-9010-8a967d0bc173-serving-cert\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477883 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-node-bootstrap-token\") pod \"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477900 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79fb293c-379a-47ac-a9b1-23746cb0758e-serving-cert\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477938 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-service-ca\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477954 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-policies\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477974 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-proxy-tls\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477990 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8v62\" (UniqueName: \"kubernetes.io/projected/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-kube-api-access-f8v62\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478015 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b37aedc-f307-4f09-899e-db5b01f89c92-config\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 
crc kubenswrapper[4848]: I0217 09:07:42.478103 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71638eb4-fb1c-42be-84d6-d900ad27f196-config\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478130 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baeac0b-e75b-49b5-9206-4cd99a4764f6-config\") pod \"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478157 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6b37aedc-f307-4f09-899e-db5b01f89c92-machine-approver-tls\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478177 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71638eb4-fb1c-42be-84d6-d900ad27f196-service-ca-bundle\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478196 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4baeac0b-e75b-49b5-9206-4cd99a4764f6-serving-cert\") pod \"console-operator-58897d9998-48d4r\" (UID: 
\"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478211 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/79fb293c-379a-47ac-a9b1-23746cb0758e-encryption-config\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478229 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd27b6bd-2fc3-4fb3-8187-d08fccb41c82-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wlh58\" (UID: \"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.477244 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b1482be7-a50a-43ca-974a-d49cad628e46-node-pullsecrets\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478196 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-blj8p"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478600 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1482be7-a50a-43ca-974a-d49cad628e46-audit-dir\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478692 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-config\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478724 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-client-ca\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478750 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw6st\" (UniqueName: \"kubernetes.io/projected/c546b6c9-bd0c-4a50-9521-49af89f3ede1-kube-api-access-sw6st\") pod \"openshift-config-operator-7777fb866f-zhzdb\" (UID: \"c546b6c9-bd0c-4a50-9521-49af89f3ede1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478779 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1482be7-a50a-43ca-974a-d49cad628e46-serving-cert\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478797 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-certs\") pod \"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 
17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478813 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414280b4-3299-4fee-a33c-a231b66000c7-metrics-certs\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478828 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-image-import-ca\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478847 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcbb8\" (UniqueName: \"kubernetes.io/projected/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-kube-api-access-vcbb8\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478865 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7771668e-6442-46eb-a2a5-53fd35396ef4-proxy-tls\") pod \"machine-config-controller-84d6567774-4cgbp\" (UID: \"7771668e-6442-46eb-a2a5-53fd35396ef4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478885 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzpfb\" (UniqueName: \"kubernetes.io/projected/6e53ded1-38a3-4129-bdfb-e0ef6ca1b748-kube-api-access-tzpfb\") pod 
\"multus-admission-controller-857f4d67dd-q62cm\" (UID: \"6e53ded1-38a3-4129-bdfb-e0ef6ca1b748\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.478913 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.479229 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-service-ca\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.479650 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4baeac0b-e75b-49b5-9206-4cd99a4764f6-config\") pod \"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.479867 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-config\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.479874 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-policies\") pod 
\"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.480029 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/c546b6c9-bd0c-4a50-9521-49af89f3ede1-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zhzdb\" (UID: \"c546b6c9-bd0c-4a50-9521-49af89f3ede1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.480242 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/79fb293c-379a-47ac-a9b1-23746cb0758e-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.480492 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-console-config\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.480736 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/6b37aedc-f307-4f09-899e-db5b01f89c92-auth-proxy-config\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.480791 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6b37aedc-f307-4f09-899e-db5b01f89c92-config\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.480931 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71638eb4-fb1c-42be-84d6-d900ad27f196-service-ca-bundle\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.481282 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.481327 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t8l8g"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.481487 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-client-ca\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.482450 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-image-import-ca\") pod \"apiserver-76f77b778f-b8gqs\" (UID: 
\"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.482693 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xc9l6"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.483868 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.483913 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd27b6bd-2fc3-4fb3-8187-d08fccb41c82-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wlh58\" (UID: \"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.483986 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4baeac0b-e75b-49b5-9206-4cd99a4764f6-serving-cert\") pod \"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.483988 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zzg7\" (UniqueName: \"kubernetes.io/projected/a9b13597-8879-40d4-965b-580222915295-kube-api-access-7zzg7\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484112 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd27b6bd-2fc3-4fb3-8187-d08fccb41c82-config\") pod 
\"kube-apiserver-operator-766d6c64bb-wlh58\" (UID: \"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484140 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-images\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484167 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-images\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484195 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/038c87fd-172d-4e34-8927-92bafb47879a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7b6g5\" (UID: \"038c87fd-172d-4e34-8927-92bafb47879a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484206 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/6b37aedc-f307-4f09-899e-db5b01f89c92-machine-approver-tls\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484218 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71638eb4-fb1c-42be-84d6-d900ad27f196-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484243 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4134a970-5107-41d2-8fbe-336387a17b77-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484267 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b1482be7-a50a-43ca-974a-d49cad628e46-serving-cert\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484268 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t97g5\" (UniqueName: \"kubernetes.io/projected/575767dd-6121-4745-aae9-c5434aee72d5-kube-api-access-t97g5\") pod \"downloads-7954f5f757-7vmx8\" (UID: \"575767dd-6121-4745-aae9-c5434aee72d5\") " pod="openshift-console/downloads-7954f5f757-7vmx8" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484337 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484361 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wq94s\" (UniqueName: \"kubernetes.io/projected/3afb3ce4-f468-4042-b4d5-61285893e7e1-kube-api-access-wq94s\") pod \"marketplace-operator-79b997595-mq695\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484405 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-etcd-client\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484435 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8vl6\" (UniqueName: \"kubernetes.io/projected/73e08276-9d2b-4ff7-a484-f19bfc28a9ec-kube-api-access-x8vl6\") pod \"dns-operator-744455d44c-c7jxs\" (UID: \"73e08276-9d2b-4ff7-a484-f19bfc28a9ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484456 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b26097da-27bf-41ed-9010-8a967d0bc173-config\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484505 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvdxj\" (UniqueName: 
\"kubernetes.io/projected/79fb293c-379a-47ac-a9b1-23746cb0758e-kube-api-access-kvdxj\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484531 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484574 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-cabundle\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484601 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484655 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-client-ca\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484679 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-config\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484702 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/981371a6-bce6-4fc3-a2ea-2f4c0e26072c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xtkxw\" (UID: \"981371a6-bce6-4fc3-a2ea-2f4c0e26072c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484750 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7q77d\" (UniqueName: \"kubernetes.io/projected/15500d79-d7c7-4a0c-965d-4783f4a85b2c-kube-api-access-7q77d\") pod \"migrator-59844c95c7-wks4t\" (UID: \"15500d79-d7c7-4a0c-965d-4783f4a85b2c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484804 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-images\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484808 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7gvvt"] Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484807 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztfzt\" (UniqueName: 
\"kubernetes.io/projected/b26097da-27bf-41ed-9010-8a967d0bc173-kube-api-access-ztfzt\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484969 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwwpg\" (UniqueName: \"kubernetes.io/projected/4134a970-5107-41d2-8fbe-336387a17b77-kube-api-access-zwwpg\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485028 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65cgg\" (UniqueName: \"kubernetes.io/projected/414280b4-3299-4fee-a33c-a231b66000c7-kube-api-access-65cgg\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485062 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/981371a6-bce6-4fc3-a2ea-2f4c0e26072c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xtkxw\" (UID: \"981371a6-bce6-4fc3-a2ea-2f4c0e26072c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485103 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-etcd-service-ca\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 
09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485124 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dfvz\" (UniqueName: \"kubernetes.io/projected/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-kube-api-access-5dfvz\") pod \"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.484972 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/038c87fd-172d-4e34-8927-92bafb47879a-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-7b6g5\" (UID: \"038c87fd-172d-4e34-8927-92bafb47879a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485198 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcfrt\" (UniqueName: \"kubernetes.io/projected/6b37aedc-f307-4f09-899e-db5b01f89c92-kube-api-access-lcfrt\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485214 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-audit\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485232 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79fb293c-379a-47ac-a9b1-23746cb0758e-etcd-client\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: 
\"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485249 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-config\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485266 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6e53ded1-38a3-4129-bdfb-e0ef6ca1b748-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-q62cm\" (UID: \"6e53ded1-38a3-4129-bdfb-e0ef6ca1b748\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485283 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4134a970-5107-41d2-8fbe-336387a17b77-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485299 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485318 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79fb293c-379a-47ac-a9b1-23746cb0758e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485432 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f794184-a546-44a9-ab3b-47d69b306384-srv-cert\") pod \"catalog-operator-68c6474976-t68jc\" (UID: \"7f794184-a546-44a9-ab3b-47d69b306384\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485453 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/038c87fd-172d-4e34-8927-92bafb47879a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7b6g5\" (UID: \"038c87fd-172d-4e34-8927-92bafb47879a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485456 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/71638eb4-fb1c-42be-84d6-d900ad27f196-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485471 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vscww\" (UniqueName: \"kubernetes.io/projected/f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95-kube-api-access-vscww\") pod \"control-plane-machine-set-operator-78cbb6b69f-4cw7d\" (UID: \"f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485512 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mq695\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485532 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9gx9\" (UniqueName: \"kubernetes.io/projected/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-kube-api-access-p9gx9\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485539 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-client-ca\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485550 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/79fb293c-379a-47ac-a9b1-23746cb0758e-audit-policies\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485585 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/b1482be7-a50a-43ca-974a-d49cad628e46-encryption-config\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485606 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32bd13d-f885-4653-8868-89fc4a8ac111-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ph5wk\" (UID: \"a32bd13d-f885-4653-8868-89fc4a8ac111\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485627 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485645 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-serving-cert\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.485671 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc 
kubenswrapper[4848]: I0217 09:07:42.485686 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.486152 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-trusted-ca-bundle\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.486238 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-config\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.486369 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71638eb4-fb1c-42be-84d6-d900ad27f196-serving-cert\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.486506 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/79fb293c-379a-47ac-a9b1-23746cb0758e-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.486511 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-config\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.486996 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/981371a6-bce6-4fc3-a2ea-2f4c0e26072c-config\") pod \"openshift-apiserver-operator-796bbdcf4f-xtkxw\" (UID: \"981371a6-bce6-4fc3-a2ea-2f4c0e26072c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.487016 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/79fb293c-379a-47ac-a9b1-23746cb0758e-audit-policies\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.487481 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4134a970-5107-41d2-8fbe-336387a17b77-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.487597 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-audit\") pod \"apiserver-76f77b778f-b8gqs\" (UID: 
\"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.487835 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/79fb293c-379a-47ac-a9b1-23746cb0758e-encryption-config\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.487910 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4baeac0b-e75b-49b5-9206-4cd99a4764f6-trusted-ca\") pod \"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.487936 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f43c1a5-4813-44e6-b2b9-53b134283a59-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vggbp\" (UID: \"9f43c1a5-4813-44e6-b2b9-53b134283a59\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.487960 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7lwn\" (UniqueName: \"kubernetes.io/projected/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-kube-api-access-s7lwn\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.487981 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-secret-volume\") pod \"collect-profiles-29521980-llz2m\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488002 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488079 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrpcf\" (UniqueName: \"kubernetes.io/projected/038c87fd-172d-4e34-8927-92bafb47879a-kube-api-access-vrpcf\") pod \"openshift-controller-manager-operator-756b6f6bc6-7b6g5\" (UID: \"038c87fd-172d-4e34-8927-92bafb47879a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488099 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-etcd-serving-ca\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488116 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7f794184-a546-44a9-ab3b-47d69b306384-profile-collector-cert\") pod \"catalog-operator-68c6474976-t68jc\" (UID: \"7f794184-a546-44a9-ab3b-47d69b306384\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488135 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7771668e-6442-46eb-a2a5-53fd35396ef4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4cgbp\" (UID: \"7771668e-6442-46eb-a2a5-53fd35396ef4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488158 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqns5\" (UniqueName: \"kubernetes.io/projected/71638eb4-fb1c-42be-84d6-d900ad27f196-kube-api-access-kqns5\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488175 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87655950-5426-4cb0-a10b-f71b0fbb0549-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488195 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-dir\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488215 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-config\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488298 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-oauth-config\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488316 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3d51261-dd68-44ef-9329-fdb8e325d504-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-47hsw\" (UID: \"a3d51261-dd68-44ef-9329-fdb8e325d504\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488335 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b1482be7-a50a-43ca-974a-d49cad628e46-etcd-client\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488350 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f43c1a5-4813-44e6-b2b9-53b134283a59-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vggbp\" (UID: \"9f43c1a5-4813-44e6-b2b9-53b134283a59\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488367 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htkx7\" (UniqueName: \"kubernetes.io/projected/7f794184-a546-44a9-ab3b-47d69b306384-kube-api-access-htkx7\") pod \"catalog-operator-68c6474976-t68jc\" (UID: \"7f794184-a546-44a9-ab3b-47d69b306384\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488386 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fba000-75c8-49be-945e-fc41fabf370c-serving-cert\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488404 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgbwd\" (UniqueName: \"kubernetes.io/projected/b0fba000-75c8-49be-945e-fc41fabf370c-kube-api-access-fgbwd\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488420 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-trusted-ca-bundle\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488437 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488454 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-oauth-serving-cert\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488474 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcq6f\" (UniqueName: \"kubernetes.io/projected/7771668e-6442-46eb-a2a5-53fd35396ef4-kube-api-access-gcq6f\") pod \"machine-config-controller-84d6567774-4cgbp\" (UID: \"7771668e-6442-46eb-a2a5-53fd35396ef4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488491 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkpmt\" (UniqueName: \"kubernetes.io/projected/87655950-5426-4cb0-a10b-f71b0fbb0549-kube-api-access-pkpmt\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488511 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") 
" pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488527 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mq695\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488547 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/414280b4-3299-4fee-a33c-a231b66000c7-service-ca-bundle\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488590 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtt5h\" (UniqueName: \"kubernetes.io/projected/4baeac0b-e75b-49b5-9206-4cd99a4764f6-kube-api-access-jtt5h\") pod \"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488616 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rkfw\" (UniqueName: \"kubernetes.io/projected/b1482be7-a50a-43ca-974a-d49cad628e46-kube-api-access-9rkfw\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488634 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/79fb293c-379a-47ac-a9b1-23746cb0758e-serving-cert\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488774 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488799 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plmfj\" (UniqueName: \"kubernetes.io/projected/834efefd-4b1f-45e3-9085-6c0dab5f4870-kube-api-access-plmfj\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488819 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/414280b4-3299-4fee-a33c-a231b66000c7-default-certificate\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488840 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c546b6c9-bd0c-4a50-9521-49af89f3ede1-serving-cert\") pod \"openshift-config-operator-7777fb866f-zhzdb\" (UID: \"c546b6c9-bd0c-4a50-9521-49af89f3ede1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 
09:07:42.488856 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4134a970-5107-41d2-8fbe-336387a17b77-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.488873 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.490210 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b1482be7-a50a-43ca-974a-d49cad628e46-etcd-serving-ca\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.490577 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.490670 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-dir\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: 
\"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.490692 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4baeac0b-e75b-49b5-9206-4cd99a4764f6-trusted-ca\") pod \"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.490724 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-oauth-serving-cert\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.491506 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.492504 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b1482be7-a50a-43ca-974a-d49cad628e46-encryption-config\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.492607 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-login\") 
pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.492655 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-config\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.492924 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-trusted-ca-bundle\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.492925 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.493008 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.493146 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-oauth-config\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.493567 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.493588 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.493931 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/79fb293c-379a-47ac-a9b1-23746cb0758e-etcd-client\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.494074 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.494230 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/981371a6-bce6-4fc3-a2ea-2f4c0e26072c-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-xtkxw\" (UID: \"981371a6-bce6-4fc3-a2ea-2f4c0e26072c\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.494651 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a32bd13d-f885-4653-8868-89fc4a8ac111-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-ph5wk\" (UID: \"a32bd13d-f885-4653-8868-89fc4a8ac111\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.494752 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7f794184-a546-44a9-ab3b-47d69b306384-srv-cert\") pod \"catalog-operator-68c6474976-t68jc\" (UID: \"7f794184-a546-44a9-ab3b-47d69b306384\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.495136 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fba000-75c8-49be-945e-fc41fabf370c-serving-cert\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.495427 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c546b6c9-bd0c-4a50-9521-49af89f3ede1-serving-cert\") pod \"openshift-config-operator-7777fb866f-zhzdb\" (UID: \"c546b6c9-bd0c-4a50-9521-49af89f3ede1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.495820 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.495833 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b1482be7-a50a-43ca-974a-d49cad628e46-etcd-client\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.496349 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-serving-cert\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.496737 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-serving-cert\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.498095 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/4134a970-5107-41d2-8fbe-336387a17b77-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.498564 4848 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/038c87fd-172d-4e34-8927-92bafb47879a-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-7b6g5\" (UID: \"038c87fd-172d-4e34-8927-92bafb47879a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.499180 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.500025 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.502508 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7f794184-a546-44a9-ab3b-47d69b306384-profile-collector-cert\") pod \"catalog-operator-68c6474976-t68jc\" (UID: \"7f794184-a546-44a9-ab3b-47d69b306384\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.511680 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.531417 4848 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.571325 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.581992 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f43c1a5-4813-44e6-b2b9-53b134283a59-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vggbp\" (UID: \"9f43c1a5-4813-44e6-b2b9-53b134283a59\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.590468 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-secret-volume\") pod \"collect-profiles-29521980-llz2m\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.590565 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7771668e-6442-46eb-a2a5-53fd35396ef4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4cgbp\" (UID: \"7771668e-6442-46eb-a2a5-53fd35396ef4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.590639 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87655950-5426-4cb0-a10b-f71b0fbb0549-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.590718 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3d51261-dd68-44ef-9329-fdb8e325d504-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-47hsw\" (UID: \"a3d51261-dd68-44ef-9329-fdb8e325d504\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.590818 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcq6f\" (UniqueName: \"kubernetes.io/projected/7771668e-6442-46eb-a2a5-53fd35396ef4-kube-api-access-gcq6f\") pod \"machine-config-controller-84d6567774-4cgbp\" (UID: \"7771668e-6442-46eb-a2a5-53fd35396ef4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.590895 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkpmt\" (UniqueName: \"kubernetes.io/projected/87655950-5426-4cb0-a10b-f71b0fbb0549-kube-api-access-pkpmt\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.590968 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mq695\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:42 crc kubenswrapper[4848]: 
I0217 09:07:42.591060 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plmfj\" (UniqueName: \"kubernetes.io/projected/834efefd-4b1f-45e3-9085-6c0dab5f4870-kube-api-access-plmfj\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591137 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gcxwv\" (UniqueName: \"kubernetes.io/projected/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-kube-api-access-gcxwv\") pod \"collect-profiles-29521980-llz2m\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591212 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87655950-5426-4cb0-a10b-f71b0fbb0549-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591295 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-key\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591225 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591363 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-serving-cert\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591467 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/73e08276-9d2b-4ff7-a484-f19bfc28a9ec-metrics-tls\") pod \"dns-operator-744455d44c-c7jxs\" (UID: \"73e08276-9d2b-4ff7-a484-f19bfc28a9ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591523 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-etcd-ca\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591554 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-config-volume\") pod \"collect-profiles-29521980-llz2m\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591595 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d51261-dd68-44ef-9329-fdb8e325d504-config\") pod \"kube-controller-manager-operator-78b949d7b-47hsw\" (UID: \"a3d51261-dd68-44ef-9329-fdb8e325d504\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591622 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591661 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3d51261-dd68-44ef-9329-fdb8e325d504-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-47hsw\" (UID: \"a3d51261-dd68-44ef-9329-fdb8e325d504\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591713 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b26097da-27bf-41ed-9010-8a967d0bc173-serving-cert\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591739 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-node-bootstrap-token\") pod \"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591783 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-proxy-tls\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: 
\"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591810 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8v62\" (UniqueName: \"kubernetes.io/projected/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-kube-api-access-f8v62\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591859 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd27b6bd-2fc3-4fb3-8187-d08fccb41c82-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wlh58\" (UID: \"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591885 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-config\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.591932 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-certs\") pod \"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592076 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7771668e-6442-46eb-a2a5-53fd35396ef4-proxy-tls\") pod \"machine-config-controller-84d6567774-4cgbp\" (UID: \"7771668e-6442-46eb-a2a5-53fd35396ef4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592115 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzpfb\" (UniqueName: \"kubernetes.io/projected/6e53ded1-38a3-4129-bdfb-e0ef6ca1b748-kube-api-access-tzpfb\") pod \"multus-admission-controller-857f4d67dd-q62cm\" (UID: \"6e53ded1-38a3-4129-bdfb-e0ef6ca1b748\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592143 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cd27b6bd-2fc3-4fb3-8187-d08fccb41c82-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wlh58\" (UID: \"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592187 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd27b6bd-2fc3-4fb3-8187-d08fccb41c82-config\") pod \"kube-apiserver-operator-766d6c64bb-wlh58\" (UID: \"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592222 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-images\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:42 crc 
kubenswrapper[4848]: I0217 09:07:42.592248 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wq94s\" (UniqueName: \"kubernetes.io/projected/3afb3ce4-f468-4042-b4d5-61285893e7e1-kube-api-access-wq94s\") pod \"marketplace-operator-79b997595-mq695\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592270 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-etcd-client\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592309 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8vl6\" (UniqueName: \"kubernetes.io/projected/73e08276-9d2b-4ff7-a484-f19bfc28a9ec-kube-api-access-x8vl6\") pod \"dns-operator-744455d44c-c7jxs\" (UID: \"73e08276-9d2b-4ff7-a484-f19bfc28a9ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592335 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b26097da-27bf-41ed-9010-8a967d0bc173-config\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592366 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-cabundle\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592410 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztfzt\" (UniqueName: \"kubernetes.io/projected/b26097da-27bf-41ed-9010-8a967d0bc173-kube-api-access-ztfzt\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592455 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-etcd-service-ca\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592489 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dfvz\" (UniqueName: \"kubernetes.io/projected/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-kube-api-access-5dfvz\") pod \"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592527 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6e53ded1-38a3-4129-bdfb-e0ef6ca1b748-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-q62cm\" (UID: \"6e53ded1-38a3-4129-bdfb-e0ef6ca1b748\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592571 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mq695\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592603 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9gx9\" (UniqueName: \"kubernetes.io/projected/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-kube-api-access-p9gx9\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.592701 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-auth-proxy-config\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.593262 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7771668e-6442-46eb-a2a5-53fd35396ef4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-4cgbp\" (UID: \"7771668e-6442-46eb-a2a5-53fd35396ef4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.594292 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-secret-volume\") pod \"collect-profiles-29521980-llz2m\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 
09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.611732 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.620460 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-4cw7d\" (UID: \"f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.631938 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.651963 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.671587 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.680714 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/414280b4-3299-4fee-a33c-a231b66000c7-service-ca-bundle\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.691998 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.711573 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.722777 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f43c1a5-4813-44e6-b2b9-53b134283a59-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vggbp\" (UID: \"9f43c1a5-4813-44e6-b2b9-53b134283a59\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.733007 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.754222 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.767835 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/414280b4-3299-4fee-a33c-a231b66000c7-default-certificate\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.771516 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.781933 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/414280b4-3299-4fee-a33c-a231b66000c7-stats-auth\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.792536 4848 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.804260 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/414280b4-3299-4fee-a33c-a231b66000c7-metrics-certs\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.811508 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.832101 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.836407 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7771668e-6442-46eb-a2a5-53fd35396ef4-proxy-tls\") pod \"machine-config-controller-84d6567774-4cgbp\" (UID: \"7771668e-6442-46eb-a2a5-53fd35396ef4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.851647 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.872208 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.892127 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.912622 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 17 09:07:42 crc 
kubenswrapper[4848]: I0217 09:07:42.926650 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/73e08276-9d2b-4ff7-a484-f19bfc28a9ec-metrics-tls\") pod \"dns-operator-744455d44c-c7jxs\" (UID: \"73e08276-9d2b-4ff7-a484-f19bfc28a9ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.932051 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.952460 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.971872 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.986786 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a3d51261-dd68-44ef-9329-fdb8e325d504-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-47hsw\" (UID: \"a3d51261-dd68-44ef-9329-fdb8e325d504\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:42 crc kubenswrapper[4848]: I0217 09:07:42.992255 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.003140 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3d51261-dd68-44ef-9329-fdb8e325d504-config\") pod \"kube-controller-manager-operator-78b949d7b-47hsw\" (UID: \"a3d51261-dd68-44ef-9329-fdb8e325d504\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.013213 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.032330 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.051910 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.072101 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.101843 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.112511 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.132053 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.152476 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.172906 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.188101 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/cd27b6bd-2fc3-4fb3-8187-d08fccb41c82-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wlh58\" (UID: \"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.192363 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.193531 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cd27b6bd-2fc3-4fb3-8187-d08fccb41c82-config\") pod \"kube-apiserver-operator-766d6c64bb-wlh58\" (UID: \"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.211532 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.232039 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.252920 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.271959 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.292469 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.312630 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 17 
09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.327510 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-serving-cert\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.332130 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.351960 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.356431 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-etcd-client\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.372239 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.374056 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-etcd-service-ca\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.389974 4848 request.go:700] Waited for 1.017697568s due to client-side throttling, not priority and fairness, request: 
GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/configmaps?fieldSelector=metadata.name%3Detcd-operator-config&limit=500&resourceVersion=0 Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.392740 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.403560 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-config\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.411078 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.412650 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-etcd-ca\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.432107 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.452685 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.457486 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6e53ded1-38a3-4129-bdfb-e0ef6ca1b748-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-q62cm\" (UID: \"6e53ded1-38a3-4129-bdfb-e0ef6ca1b748\") 
" pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.485082 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.491319 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.495067 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-mq695\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.511401 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.515470 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-mq695\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.532121 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.552598 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.572727 4848 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591241 4848 secret.go:188] Couldn't get secret openshift-kube-storage-version-migrator-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591340 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/87655950-5426-4cb0-a10b-f71b0fbb0549-serving-cert podName:87655950-5426-4cb0-a10b-f71b0fbb0549 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:44.091312779 +0000 UTC m=+141.634568425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/87655950-5426-4cb0-a10b-f71b0fbb0549-serving-cert") pod "kube-storage-version-migrator-operator-b67b599dd-tgzvl" (UID: "87655950-5426-4cb0-a10b-f71b0fbb0549") : failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591466 4848 configmap.go:193] Couldn't get configMap openshift-kube-storage-version-migrator-operator/config: failed to sync configmap cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591473 4848 secret.go:188] Couldn't get secret openshift-service-ca/signing-key: failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591643 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/87655950-5426-4cb0-a10b-f71b0fbb0549-config podName:87655950-5426-4cb0-a10b-f71b0fbb0549 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:44.091559276 +0000 UTC m=+141.634814962 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/87655950-5426-4cb0-a10b-f71b0fbb0549-config") pod "kube-storage-version-migrator-operator-b67b599dd-tgzvl" (UID: "87655950-5426-4cb0-a10b-f71b0fbb0549") : failed to sync configmap cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591690 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-key podName:834efefd-4b1f-45e3-9085-6c0dab5f4870 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:44.091663539 +0000 UTC m=+141.634919275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "signing-key" (UniqueName: "kubernetes.io/secret/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-key") pod "service-ca-9c57cc56f-t8l8g" (UID: "834efefd-4b1f-45e3-9085-6c0dab5f4870") : failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591882 4848 secret.go:188] Couldn't get secret openshift-service-ca-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591897 4848 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591957 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b26097da-27bf-41ed-9010-8a967d0bc173-serving-cert podName:b26097da-27bf-41ed-9010-8a967d0bc173 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:44.091933006 +0000 UTC m=+141.635188712 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b26097da-27bf-41ed-9010-8a967d0bc173-serving-cert") pod "service-ca-operator-777779d784-86pgj" (UID: "b26097da-27bf-41ed-9010-8a967d0bc173") : failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591990 4848 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.591993 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-config-volume podName:f4d0b3bb-c027-4390-92ac-66aad8bf0d19 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:44.091977048 +0000 UTC m=+141.635232754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-config-volume") pod "collect-profiles-29521980-llz2m" (UID: "f4d0b3bb-c027-4390-92ac-66aad8bf0d19") : failed to sync configmap cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.592086 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-node-bootstrap-token podName:b02fa3f4-70d0-4ea7-8359-5cb6611ef778 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:44.09206592 +0000 UTC m=+141.635321596 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-node-bootstrap-token") pod "machine-config-server-jsjk5" (UID: "b02fa3f4-70d0-4ea7-8359-5cb6611ef778") : failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.592376 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.592983 4848 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.593029 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-certs podName:b02fa3f4-70d0-4ea7-8359-5cb6611ef778 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:44.093018166 +0000 UTC m=+141.636273912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-certs") pod "machine-config-server-jsjk5" (UID: "b02fa3f4-70d0-4ea7-8359-5cb6611ef778") : failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.593237 4848 configmap.go:193] Couldn't get configMap openshift-service-ca/signing-cabundle: failed to sync configmap cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.593276 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-cabundle podName:834efefd-4b1f-45e3-9085-6c0dab5f4870 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:44.093263803 +0000 UTC m=+141.636519549 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "signing-cabundle" (UniqueName: "kubernetes.io/configmap/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-cabundle") pod "service-ca-9c57cc56f-t8l8g" (UID: "834efefd-4b1f-45e3-9085-6c0dab5f4870") : failed to sync configmap cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.593301 4848 secret.go:188] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.593342 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-proxy-tls podName:dd116ca8-9aec-4817-a7b2-a242e01a0a2e nodeName:}" failed. No retries permitted until 2026-02-17 09:07:44.093331265 +0000 UTC m=+141.636587021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-proxy-tls") pod "machine-config-operator-74547568cd-xdqnq" (UID: "dd116ca8-9aec-4817-a7b2-a242e01a0a2e") : failed to sync secret cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.593459 4848 configmap.go:193] Couldn't get configMap openshift-service-ca-operator/service-ca-operator-config: failed to sync configmap cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: E0217 09:07:43.593889 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b26097da-27bf-41ed-9010-8a967d0bc173-config podName:b26097da-27bf-41ed-9010-8a967d0bc173 nodeName:}" failed. No retries permitted until 2026-02-17 09:07:44.093693635 +0000 UTC m=+141.636949341 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/b26097da-27bf-41ed-9010-8a967d0bc173-config") pod "service-ca-operator-777779d784-86pgj" (UID: "b26097da-27bf-41ed-9010-8a967d0bc173") : failed to sync configmap cache: timed out waiting for the condition Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.594459 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-images\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.616220 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.631817 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.652192 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.671059 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.691932 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.711253 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 
09:07:43.733248 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.752287 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.772090 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.792658 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.812908 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.832179 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.851278 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.871313 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.892094 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.912271 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.933034 4848 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.951831 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.972286 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 17 09:07:43 crc kubenswrapper[4848]: I0217 09:07:43.991850 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.012672 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.032343 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.052848 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.072441 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.093081 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.113168 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.117301 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87655950-5426-4cb0-a10b-f71b0fbb0549-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.117423 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87655950-5426-4cb0-a10b-f71b0fbb0549-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.117452 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-key\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.117474 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-config-volume\") pod \"collect-profiles-29521980-llz2m\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.117544 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b26097da-27bf-41ed-9010-8a967d0bc173-serving-cert\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.117565 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-node-bootstrap-token\") pod \"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.117599 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-proxy-tls\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.117657 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-certs\") pod \"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.117735 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b26097da-27bf-41ed-9010-8a967d0bc173-config\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.117810 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-cabundle\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:44 crc 
kubenswrapper[4848]: I0217 09:07:44.118887 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-cabundle\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.118968 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b26097da-27bf-41ed-9010-8a967d0bc173-config\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.119119 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87655950-5426-4cb0-a10b-f71b0fbb0549-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.119228 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-config-volume\") pod \"collect-profiles-29521980-llz2m\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.121261 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-proxy-tls\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.121303 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b26097da-27bf-41ed-9010-8a967d0bc173-serving-cert\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.123440 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/834efefd-4b1f-45e3-9085-6c0dab5f4870-signing-key\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.123887 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/87655950-5426-4cb0-a10b-f71b0fbb0549-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.127389 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-node-bootstrap-token\") pod \"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.131793 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-certs\") pod 
\"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.133002 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.172400 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.192148 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.211206 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.231858 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.252223 4848 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.274118 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.291110 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.337347 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpt56\" (UniqueName: \"kubernetes.io/projected/a32bd13d-f885-4653-8868-89fc4a8ac111-kube-api-access-qpt56\") pod \"cluster-samples-operator-665b6dd947-ph5wk\" (UID: \"a32bd13d-f885-4653-8868-89fc4a8ac111\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.354508 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jlj2\" (UniqueName: \"kubernetes.io/projected/981371a6-bce6-4fc3-a2ea-2f4c0e26072c-kube-api-access-6jlj2\") pod \"openshift-apiserver-operator-796bbdcf4f-xtkxw\" (UID: \"981371a6-bce6-4fc3-a2ea-2f4c0e26072c\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.393641 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tsjg\" (UniqueName: \"kubernetes.io/projected/1d8cdbb3-b672-4984-8d03-562965a7b081-kube-api-access-5tsjg\") pod \"oauth-openshift-558db77b4-xzdww\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.410823 4848 request.go:700] Waited for 1.926652133s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console/serviceaccounts/console/token Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.423147 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcbb8\" (UniqueName: \"kubernetes.io/projected/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-kube-api-access-vcbb8\") pod \"route-controller-manager-6576b87f9c-lwfvg\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.429667 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw6st\" (UniqueName: \"kubernetes.io/projected/c546b6c9-bd0c-4a50-9521-49af89f3ede1-kube-api-access-sw6st\") pod \"openshift-config-operator-7777fb866f-zhzdb\" (UID: 
\"c546b6c9-bd0c-4a50-9521-49af89f3ede1\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.446504 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zzg7\" (UniqueName: \"kubernetes.io/projected/a9b13597-8879-40d4-965b-580222915295-kube-api-access-7zzg7\") pod \"console-f9d7485db-9h2hf\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.461215 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97g5\" (UniqueName: \"kubernetes.io/projected/575767dd-6121-4745-aae9-c5434aee72d5-kube-api-access-t97g5\") pod \"downloads-7954f5f757-7vmx8\" (UID: \"575767dd-6121-4745-aae9-c5434aee72d5\") " pod="openshift-console/downloads-7954f5f757-7vmx8" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.461605 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.467059 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdxj\" (UniqueName: \"kubernetes.io/projected/79fb293c-379a-47ac-a9b1-23746cb0758e-kube-api-access-kvdxj\") pod \"apiserver-7bbb656c7d-zw6lz\" (UID: \"79fb293c-379a-47ac-a9b1-23746cb0758e\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.489016 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwwpg\" (UniqueName: \"kubernetes.io/projected/4134a970-5107-41d2-8fbe-336387a17b77-kube-api-access-zwwpg\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.494179 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.510170 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.511967 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65cgg\" (UniqueName: \"kubernetes.io/projected/414280b4-3299-4fee-a33c-a231b66000c7-kube-api-access-65cgg\") pod \"router-default-5444994796-gxh7z\" (UID: \"414280b4-3299-4fee-a33c-a231b66000c7\") " pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.534012 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.536198 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcfrt\" (UniqueName: \"kubernetes.io/projected/6b37aedc-f307-4f09-899e-db5b01f89c92-kube-api-access-lcfrt\") pod \"machine-approver-56656f9798-2b8xg\" (UID: \"6b37aedc-f307-4f09-899e-db5b01f89c92\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.544115 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-7vmx8" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.548199 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.556290 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q77d\" (UniqueName: \"kubernetes.io/projected/15500d79-d7c7-4a0c-965d-4783f4a85b2c-kube-api-access-7q77d\") pod \"migrator-59844c95c7-wks4t\" (UID: \"15500d79-d7c7-4a0c-965d-4783f4a85b2c\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.558969 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.570590 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.577789 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rkfw\" (UniqueName: \"kubernetes.io/projected/b1482be7-a50a-43ca-974a-d49cad628e46-kube-api-access-9rkfw\") pod \"apiserver-76f77b778f-b8gqs\" (UID: \"b1482be7-a50a-43ca-974a-d49cad628e46\") " pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.603052 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrpcf\" (UniqueName: \"kubernetes.io/projected/038c87fd-172d-4e34-8927-92bafb47879a-kube-api-access-vrpcf\") pod \"openshift-controller-manager-operator-756b6f6bc6-7b6g5\" (UID: \"038c87fd-172d-4e34-8927-92bafb47879a\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.629681 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vscww\" (UniqueName: \"kubernetes.io/projected/f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95-kube-api-access-vscww\") pod \"control-plane-machine-set-operator-78cbb6b69f-4cw7d\" (UID: \"f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.635299 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.635625 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f43c1a5-4813-44e6-b2b9-53b134283a59-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-vggbp\" (UID: \"9f43c1a5-4813-44e6-b2b9-53b134283a59\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.660951 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7lwn\" (UniqueName: \"kubernetes.io/projected/1ebf9d1e-e313-440d-992a-9e0ede5b2b24-kube-api-access-s7lwn\") pod \"machine-api-operator-5694c8668f-fqhrm\" (UID: \"1ebf9d1e-e313-440d-992a-9e0ede5b2b24\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.672983 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.673686 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqns5\" (UniqueName: \"kubernetes.io/projected/71638eb4-fb1c-42be-84d6-d900ad27f196-kube-api-access-kqns5\") pod \"authentication-operator-69f744f599-9zhhp\" (UID: \"71638eb4-fb1c-42be-84d6-d900ad27f196\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.674483 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.684037 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.706973 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htkx7\" (UniqueName: \"kubernetes.io/projected/7f794184-a546-44a9-ab3b-47d69b306384-kube-api-access-htkx7\") pod \"catalog-operator-68c6474976-t68jc\" (UID: \"7f794184-a546-44a9-ab3b-47d69b306384\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.708555 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.711328 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtt5h\" (UniqueName: \"kubernetes.io/projected/4baeac0b-e75b-49b5-9206-4cd99a4764f6-kube-api-access-jtt5h\") pod \"console-operator-58897d9998-48d4r\" (UID: \"4baeac0b-e75b-49b5-9206-4cd99a4764f6\") " pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.728684 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg"] Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.729135 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4134a970-5107-41d2-8fbe-336387a17b77-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-bmjgh\" (UID: \"4134a970-5107-41d2-8fbe-336387a17b77\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.734020 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.751160 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgbwd\" (UniqueName: \"kubernetes.io/projected/b0fba000-75c8-49be-945e-fc41fabf370c-kube-api-access-fgbwd\") pod \"controller-manager-879f6c89f-6xbj4\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.780035 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.787807 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.798143 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcq6f\" (UniqueName: \"kubernetes.io/projected/7771668e-6442-46eb-a2a5-53fd35396ef4-kube-api-access-gcq6f\") pod \"machine-config-controller-84d6567774-4cgbp\" (UID: \"7771668e-6442-46eb-a2a5-53fd35396ef4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.810880 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkpmt\" (UniqueName: \"kubernetes.io/projected/87655950-5426-4cb0-a10b-f71b0fbb0549-kube-api-access-pkpmt\") pod \"kube-storage-version-migrator-operator-b67b599dd-tgzvl\" (UID: \"87655950-5426-4cb0-a10b-f71b0fbb0549\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.817472 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.827540 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.830929 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcxwv\" (UniqueName: \"kubernetes.io/projected/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-kube-api-access-gcxwv\") pod \"collect-profiles-29521980-llz2m\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.836040 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.854413 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plmfj\" (UniqueName: \"kubernetes.io/projected/834efefd-4b1f-45e3-9085-6c0dab5f4870-kube-api-access-plmfj\") pod \"service-ca-9c57cc56f-t8l8g\" (UID: \"834efefd-4b1f-45e3-9085-6c0dab5f4870\") " pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.862293 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.875626 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a3d51261-dd68-44ef-9329-fdb8e325d504-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-47hsw\" (UID: \"a3d51261-dd68-44ef-9329-fdb8e325d504\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.878581 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.890221 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8v62\" (UniqueName: \"kubernetes.io/projected/d5ece5ca-10cc-4597-b14e-60eb9091c7ff-kube-api-access-f8v62\") pod \"etcd-operator-b45778765-6rbxt\" (UID: \"d5ece5ca-10cc-4597-b14e-60eb9091c7ff\") " pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.906048 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cd27b6bd-2fc3-4fb3-8187-d08fccb41c82-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wlh58\" (UID: \"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.931634 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wq94s\" (UniqueName: \"kubernetes.io/projected/3afb3ce4-f468-4042-b4d5-61285893e7e1-kube-api-access-wq94s\") pod \"marketplace-operator-79b997595-mq695\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.943715 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.952029 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8vl6\" (UniqueName: \"kubernetes.io/projected/73e08276-9d2b-4ff7-a484-f19bfc28a9ec-kube-api-access-x8vl6\") pod \"dns-operator-744455d44c-c7jxs\" (UID: \"73e08276-9d2b-4ff7-a484-f19bfc28a9ec\") " pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.953768 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.968477 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.972370 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzpfb\" (UniqueName: \"kubernetes.io/projected/6e53ded1-38a3-4129-bdfb-e0ef6ca1b748-kube-api-access-tzpfb\") pod \"multus-admission-controller-857f4d67dd-q62cm\" (UID: \"6e53ded1-38a3-4129-bdfb-e0ef6ca1b748\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" Feb 17 09:07:44 crc kubenswrapper[4848]: I0217 09:07:44.990594 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztfzt\" (UniqueName: \"kubernetes.io/projected/b26097da-27bf-41ed-9010-8a967d0bc173-kube-api-access-ztfzt\") pod \"service-ca-operator-777779d784-86pgj\" (UID: \"b26097da-27bf-41ed-9010-8a967d0bc173\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.012883 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9gx9\" (UniqueName: \"kubernetes.io/projected/dd116ca8-9aec-4817-a7b2-a242e01a0a2e-kube-api-access-p9gx9\") pod \"machine-config-operator-74547568cd-xdqnq\" (UID: \"dd116ca8-9aec-4817-a7b2-a242e01a0a2e\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.017416 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.027709 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dfvz\" (UniqueName: \"kubernetes.io/projected/b02fa3f4-70d0-4ea7-8359-5cb6611ef778-kube-api-access-5dfvz\") pod \"machine-config-server-jsjk5\" (UID: \"b02fa3f4-70d0-4ea7-8359-5cb6611ef778\") " pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.035481 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.040365 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.053027 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-7vmx8"] Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.054132 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.070321 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" Feb 17 09:07:45 crc kubenswrapper[4848]: W0217 09:07:45.071578 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod575767dd_6121_4745_aae9_c5434aee72d5.slice/crio-f269cc25872a2a5ea66d04313ff591abc237bd763e8b375678f4760ee9347f7c WatchSource:0}: Error finding container f269cc25872a2a5ea66d04313ff591abc237bd763e8b375678f4760ee9347f7c: Status 404 returned error can't find the container with id f269cc25872a2a5ea66d04313ff591abc237bd763e8b375678f4760ee9347f7c Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.078241 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.078655 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xzdww"] Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.085317 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.099597 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz"] Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.101583 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.121926 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.132844 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nhsj\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-kube-api-access-2nhsj\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.132869 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93141f54-9595-4160-828c-f09251e540b7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.132886 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-apiservice-cert\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.132902 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-djpdz\" (UID: \"8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.132932 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-bound-sa-token\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.132959 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e29aabb7-fe3b-4887-a00d-535144b46d4b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.132973 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1f199330-d302-4a0f-8b32-8a334c243125-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dx9bs\" (UID: \"1f199330-d302-4a0f-8b32-8a334c243125\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.132987 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjwm8\" (UniqueName: \"kubernetes.io/projected/8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4-kube-api-access-jjwm8\") pod \"package-server-manager-789f6589d5-djpdz\" (UID: \"8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133028 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-certificates\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133046 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zmtw\" (UniqueName: \"kubernetes.io/projected/06e92d67-efb3-4cc4-b304-9543942bc23c-kube-api-access-7zmtw\") pod \"dns-default-7gvvt\" (UID: \"06e92d67-efb3-4cc4-b304-9543942bc23c\") " pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133062 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06e92d67-efb3-4cc4-b304-9543942bc23c-metrics-tls\") pod \"dns-default-7gvvt\" (UID: \"06e92d67-efb3-4cc4-b304-9543942bc23c\") " pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133084 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06e92d67-efb3-4cc4-b304-9543942bc23c-config-volume\") pod \"dns-default-7gvvt\" (UID: \"06e92d67-efb3-4cc4-b304-9543942bc23c\") " pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133110 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93141f54-9595-4160-828c-f09251e540b7-metrics-tls\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133147 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-webhook-cert\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133163 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93141f54-9595-4160-828c-f09251e540b7-trusted-ca\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133196 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-tmpfs\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133211 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-tls\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133302 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" 
Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133319 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg8rc\" (UniqueName: \"kubernetes.io/projected/93141f54-9595-4160-828c-f09251e540b7-kube-api-access-hg8rc\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133334 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1f199330-d302-4a0f-8b32-8a334c243125-srv-cert\") pod \"olm-operator-6b444d44fb-dx9bs\" (UID: \"1f199330-d302-4a0f-8b32-8a334c243125\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133349 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e29aabb7-fe3b-4887-a00d-535144b46d4b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133374 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klb8f\" (UniqueName: \"kubernetes.io/projected/1f199330-d302-4a0f-8b32-8a334c243125-kube-api-access-klb8f\") pod \"olm-operator-6b444d44fb-dx9bs\" (UID: \"1f199330-d302-4a0f-8b32-8a334c243125\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133440 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-trusted-ca\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.133466 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg5rz\" (UniqueName: \"kubernetes.io/projected/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-kube-api-access-mg5rz\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.136602 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:45.636590144 +0000 UTC m=+143.179845790 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.154003 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jsjk5" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.165120 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" event={"ID":"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a","Type":"ContainerStarted","Data":"3db917ae5c371beef36fa9f850d541ccaa54d93d792ac38565efdc62c7cda6e8"} Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.165164 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" event={"ID":"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a","Type":"ContainerStarted","Data":"52e2f74afbe0914ea80f4fc0b8fed78194bf90a8f3fb5c6243921f6b3025c569"} Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.166577 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.176079 4848 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lwfvg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.176124 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" podUID="2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.181358 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gxh7z" 
event={"ID":"414280b4-3299-4fee-a33c-a231b66000c7","Type":"ContainerStarted","Data":"ad2a9435388fb0f62399d0666d0e5d1a1afea255607a68ab5803c5075a3447f1"} Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.181402 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-gxh7z" event={"ID":"414280b4-3299-4fee-a33c-a231b66000c7","Type":"ContainerStarted","Data":"f4b66471192e437bfde91cd3322ff3354c91e058e678ebcd0ef69100d9a24cb5"} Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.183073 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" event={"ID":"1d8cdbb3-b672-4984-8d03-562965a7b081","Type":"ContainerStarted","Data":"22d757470f1e9e3324bd6a5f6d590599abe9d7b44654447975e74220d7327564"} Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.184053 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7vmx8" event={"ID":"575767dd-6121-4745-aae9-c5434aee72d5","Type":"ContainerStarted","Data":"f269cc25872a2a5ea66d04313ff591abc237bd763e8b375678f4760ee9347f7c"} Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.198923 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" event={"ID":"6b37aedc-f307-4f09-899e-db5b01f89c92","Type":"ContainerStarted","Data":"acd047bbb916ad617b577ca406aa33311b108ab708105ecc5538280aa319742f"} Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.198959 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" event={"ID":"6b37aedc-f307-4f09-899e-db5b01f89c92","Type":"ContainerStarted","Data":"e17f19d692865f760e085e1842486c34e9d86750a0aa470ecb586d1cb96a785c"} Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.235031 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.235281 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-socket-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.235357 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg8rc\" (UniqueName: \"kubernetes.io/projected/93141f54-9595-4160-828c-f09251e540b7-kube-api-access-hg8rc\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.235406 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1f199330-d302-4a0f-8b32-8a334c243125-srv-cert\") pod \"olm-operator-6b444d44fb-dx9bs\" (UID: \"1f199330-d302-4a0f-8b32-8a334c243125\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.235431 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e29aabb7-fe3b-4887-a00d-535144b46d4b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.235469 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-mountpoint-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.235490 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klb8f\" (UniqueName: \"kubernetes.io/projected/1f199330-d302-4a0f-8b32-8a334c243125-kube-api-access-klb8f\") pod \"olm-operator-6b444d44fb-dx9bs\" (UID: \"1f199330-d302-4a0f-8b32-8a334c243125\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.235546 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a5ecbfe-7c94-4d35-a911-9c82ba255d04-cert\") pod \"ingress-canary-blj8p\" (UID: \"0a5ecbfe-7c94-4d35-a911-9c82ba255d04\") " pod="openshift-ingress-canary/ingress-canary-blj8p" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.235693 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-csi-data-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.236737 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-plugins-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc 
kubenswrapper[4848]: I0217 09:07:45.236829 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-trusted-ca\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.236876 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:45.736858176 +0000 UTC m=+143.280113822 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.237969 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg5rz\" (UniqueName: \"kubernetes.io/projected/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-kube-api-access-mg5rz\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.238168 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nhsj\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-kube-api-access-2nhsj\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: 
\"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.238547 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93141f54-9595-4160-828c-f09251e540b7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.238592 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-apiservice-cert\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.238614 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-djpdz\" (UID: \"8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.238776 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-bound-sa-token\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.238845 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" 
(UniqueName: \"kubernetes.io/empty-dir/e29aabb7-fe3b-4887-a00d-535144b46d4b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.238921 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-trusted-ca\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.238946 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1f199330-d302-4a0f-8b32-8a334c243125-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dx9bs\" (UID: \"1f199330-d302-4a0f-8b32-8a334c243125\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.239106 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjwm8\" (UniqueName: \"kubernetes.io/projected/8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4-kube-api-access-jjwm8\") pod \"package-server-manager-789f6589d5-djpdz\" (UID: \"8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.239262 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e29aabb7-fe3b-4887-a00d-535144b46d4b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 
09:07:45.239326 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-registration-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.240137 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-certificates\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.240166 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zmtw\" (UniqueName: \"kubernetes.io/projected/06e92d67-efb3-4cc4-b304-9543942bc23c-kube-api-access-7zmtw\") pod \"dns-default-7gvvt\" (UID: \"06e92d67-efb3-4cc4-b304-9543942bc23c\") " pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.240206 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06e92d67-efb3-4cc4-b304-9543942bc23c-metrics-tls\") pod \"dns-default-7gvvt\" (UID: \"06e92d67-efb3-4cc4-b304-9543942bc23c\") " pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.240225 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06e92d67-efb3-4cc4-b304-9543942bc23c-config-volume\") pod \"dns-default-7gvvt\" (UID: \"06e92d67-efb3-4cc4-b304-9543942bc23c\") " pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.240292 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93141f54-9595-4160-828c-f09251e540b7-metrics-tls\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.240494 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-webhook-cert\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.240530 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7wmk\" (UniqueName: \"kubernetes.io/projected/7e326ca3-7185-4e3b-a023-6ccbf1214457-kube-api-access-j7wmk\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.240620 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93141f54-9595-4160-828c-f09251e540b7-trusted-ca\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.240778 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-tmpfs\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 
crc kubenswrapper[4848]: I0217 09:07:45.240811 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-tls\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.240841 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5dq7\" (UniqueName: \"kubernetes.io/projected/0a5ecbfe-7c94-4d35-a911-9c82ba255d04-kube-api-access-q5dq7\") pod \"ingress-canary-blj8p\" (UID: \"0a5ecbfe-7c94-4d35-a911-9c82ba255d04\") " pod="openshift-ingress-canary/ingress-canary-blj8p" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.241360 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-certificates\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.242649 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06e92d67-efb3-4cc4-b304-9543942bc23c-config-volume\") pod \"dns-default-7gvvt\" (UID: \"06e92d67-efb3-4cc4-b304-9543942bc23c\") " pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.243219 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-tmpfs\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc 
kubenswrapper[4848]: I0217 09:07:45.243746 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/93141f54-9595-4160-828c-f09251e540b7-trusted-ca\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.247549 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-apiservice-cert\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.247908 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e29aabb7-fe3b-4887-a00d-535144b46d4b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.247970 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-djpdz\" (UID: \"8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.248221 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1f199330-d302-4a0f-8b32-8a334c243125-srv-cert\") pod \"olm-operator-6b444d44fb-dx9bs\" (UID: \"1f199330-d302-4a0f-8b32-8a334c243125\") 
" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.249172 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93141f54-9595-4160-828c-f09251e540b7-metrics-tls\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.251113 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-webhook-cert\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.252709 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-tls\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.253340 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1f199330-d302-4a0f-8b32-8a334c243125-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dx9bs\" (UID: \"1f199330-d302-4a0f-8b32-8a334c243125\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.255892 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/06e92d67-efb3-4cc4-b304-9543942bc23c-metrics-tls\") pod \"dns-default-7gvvt\" (UID: 
\"06e92d67-efb3-4cc4-b304-9543942bc23c\") " pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.341914 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-registration-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.341983 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7wmk\" (UniqueName: \"kubernetes.io/projected/7e326ca3-7185-4e3b-a023-6ccbf1214457-kube-api-access-j7wmk\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.342027 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5dq7\" (UniqueName: \"kubernetes.io/projected/0a5ecbfe-7c94-4d35-a911-9c82ba255d04-kube-api-access-q5dq7\") pod \"ingress-canary-blj8p\" (UID: \"0a5ecbfe-7c94-4d35-a911-9c82ba255d04\") " pod="openshift-ingress-canary/ingress-canary-blj8p" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.342072 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-socket-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.342099 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.342167 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-mountpoint-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.342183 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a5ecbfe-7c94-4d35-a911-9c82ba255d04-cert\") pod \"ingress-canary-blj8p\" (UID: \"0a5ecbfe-7c94-4d35-a911-9c82ba255d04\") " pod="openshift-ingress-canary/ingress-canary-blj8p" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.342211 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-csi-data-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.342237 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-plugins-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.342616 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-mountpoint-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: 
\"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.342688 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-registration-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.342813 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-socket-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.343089 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-csi-data-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.343123 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7e326ca3-7185-4e3b-a023-6ccbf1214457-plugins-dir\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.343194 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:45.843154911 +0000 UTC m=+143.386410567 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.442911 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.443088 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:45.943059143 +0000 UTC m=+143.486314859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.484489 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg5rz\" (UniqueName: \"kubernetes.io/projected/e45c378d-04fe-48ff-87c9-6fcc02ede1b9-kube-api-access-mg5rz\") pod \"packageserver-d55dfcdfc-lf2lr\" (UID: \"e45c378d-04fe-48ff-87c9-6fcc02ede1b9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.486082 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0a5ecbfe-7c94-4d35-a911-9c82ba255d04-cert\") pod \"ingress-canary-blj8p\" (UID: \"0a5ecbfe-7c94-4d35-a911-9c82ba255d04\") " pod="openshift-ingress-canary/ingress-canary-blj8p" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.486916 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nhsj\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-kube-api-access-2nhsj\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.487670 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjwm8\" (UniqueName: \"kubernetes.io/projected/8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4-kube-api-access-jjwm8\") pod \"package-server-manager-789f6589d5-djpdz\" (UID: 
\"8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.488093 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93141f54-9595-4160-828c-f09251e540b7-bound-sa-token\") pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.489584 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zmtw\" (UniqueName: \"kubernetes.io/projected/06e92d67-efb3-4cc4-b304-9543942bc23c-kube-api-access-7zmtw\") pod \"dns-default-7gvvt\" (UID: \"06e92d67-efb3-4cc4-b304-9543942bc23c\") " pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.489848 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7wmk\" (UniqueName: \"kubernetes.io/projected/7e326ca3-7185-4e3b-a023-6ccbf1214457-kube-api-access-j7wmk\") pod \"csi-hostpathplugin-xc9l6\" (UID: \"7e326ca3-7185-4e3b-a023-6ccbf1214457\") " pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.491023 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klb8f\" (UniqueName: \"kubernetes.io/projected/1f199330-d302-4a0f-8b32-8a334c243125-kube-api-access-klb8f\") pod \"olm-operator-6b444d44fb-dx9bs\" (UID: \"1f199330-d302-4a0f-8b32-8a334c243125\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.492303 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg8rc\" (UniqueName: \"kubernetes.io/projected/93141f54-9595-4160-828c-f09251e540b7-kube-api-access-hg8rc\") 
pod \"ingress-operator-5b745b69d9-d4p2l\" (UID: \"93141f54-9595-4160-828c-f09251e540b7\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.494042 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5dq7\" (UniqueName: \"kubernetes.io/projected/0a5ecbfe-7c94-4d35-a911-9c82ba255d04-kube-api-access-q5dq7\") pod \"ingress-canary-blj8p\" (UID: \"0a5ecbfe-7c94-4d35-a911-9c82ba255d04\") " pod="openshift-ingress-canary/ingress-canary-blj8p" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.494180 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.495976 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-bound-sa-token\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.544657 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.545090 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.045074983 +0000 UTC m=+143.588330639 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.646370 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.646572 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.146541547 +0000 UTC m=+143.689797193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.646744 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.646973 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.647095 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.147077052 +0000 UTC m=+143.690332698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.677407 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.695333 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.708680 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.730816 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.746383 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.747345 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.747449 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.247425156 +0000 UTC m=+143.790680802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.747562 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.747938 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2026-02-17 09:07:46.247919139 +0000 UTC m=+143.791174875 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.757315 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:45 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:45 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:45 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.757389 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.767877 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-blj8p" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.848814 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.849228 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.349195149 +0000 UTC m=+143.892450795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.888322 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb"] Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.897539 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9h2hf"] Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.913191 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5"] Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.916359 4848 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-gxh7z" podStartSLOduration=118.916343485 podStartE2EDuration="1m58.916343485s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:45.915820761 +0000 UTC m=+143.459076427" watchObservedRunningTime="2026-02-17 09:07:45.916343485 +0000 UTC m=+143.459599131" Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.941815 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk"] Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.948095 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp"] Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.948132 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw"] Feb 17 09:07:45 crc kubenswrapper[4848]: I0217 09:07:45.949994 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:45 crc kubenswrapper[4848]: E0217 09:07:45.950269 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.450257492 +0000 UTC m=+143.993513138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: W0217 09:07:46.043691 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod038c87fd_172d_4e34_8927_92bafb47879a.slice/crio-fa68abc0f36928495633f1a9a3282fe0d822a83b5ce92d8697ab42be7297a54f WatchSource:0}: Error finding container fa68abc0f36928495633f1a9a3282fe0d822a83b5ce92d8697ab42be7297a54f: Status 404 returned error can't find the container with id fa68abc0f36928495633f1a9a3282fe0d822a83b5ce92d8697ab42be7297a54f Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.050600 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.050842 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.550810002 +0000 UTC m=+144.094065648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.050919 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.051225 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.551219233 +0000 UTC m=+144.094474879 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.151610 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.151836 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.651817494 +0000 UTC m=+144.195073140 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.152201 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.152781 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.652750149 +0000 UTC m=+144.196005795 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.203105 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9h2hf" event={"ID":"a9b13597-8879-40d4-965b-580222915295","Type":"ContainerStarted","Data":"6c65c190555b1c38fa194ca3c16b1f1f6758c38f55327734a6bec84c9e3ca899"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.208927 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jsjk5" event={"ID":"b02fa3f4-70d0-4ea7-8359-5cb6611ef778","Type":"ContainerStarted","Data":"9c2597ed1038d0ff4f26fc8e324a27c8d9d3287112d30c755677f1530ededf1e"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.208971 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jsjk5" event={"ID":"b02fa3f4-70d0-4ea7-8359-5cb6611ef778","Type":"ContainerStarted","Data":"d656f7c55e862ec7634c47618a3675483363092c101c66fa2f7535426879e5c9"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.215999 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" event={"ID":"1d8cdbb3-b672-4984-8d03-562965a7b081","Type":"ContainerStarted","Data":"02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.216156 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:46 crc 
kubenswrapper[4848]: I0217 09:07:46.217558 4848 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-xzdww container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" start-of-body= Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.217593 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" podUID="1d8cdbb3-b672-4984-8d03-562965a7b081" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.219825 4848 generic.go:334] "Generic (PLEG): container finished" podID="79fb293c-379a-47ac-a9b1-23746cb0758e" containerID="e406b5404136490259071245a9157c92ee165baeba1c59cea993df3dd651473f" exitCode=0 Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.219994 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" event={"ID":"79fb293c-379a-47ac-a9b1-23746cb0758e","Type":"ContainerDied","Data":"e406b5404136490259071245a9157c92ee165baeba1c59cea993df3dd651473f"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.220021 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" event={"ID":"79fb293c-379a-47ac-a9b1-23746cb0758e","Type":"ContainerStarted","Data":"1f98d07c8ff53f6b9e557deaf3a551172af54bc2998461fb7ed669c8e51c3864"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.226948 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" event={"ID":"a32bd13d-f885-4653-8868-89fc4a8ac111","Type":"ContainerStarted","Data":"a8663e667d34c0b8f43e3146700c1fc4b69f84ead2cfef2174bd5ac40137f90d"} Feb 17 09:07:46 crc 
kubenswrapper[4848]: I0217 09:07:46.228797 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" event={"ID":"c546b6c9-bd0c-4a50-9521-49af89f3ede1","Type":"ContainerStarted","Data":"71dbf5c94922b07e7245dce54992cb1fcb1df19dfdc8982585e346d3de929090"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.230251 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" event={"ID":"6b37aedc-f307-4f09-899e-db5b01f89c92","Type":"ContainerStarted","Data":"0b88acf433ee7f0e51542bad44402bd9d96a26526076d0ae1ed535415eace9c3"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.231058 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" event={"ID":"981371a6-bce6-4fc3-a2ea-2f4c0e26072c","Type":"ContainerStarted","Data":"a04d32a32e47810ce7e53631055adcab72030ac6c5672a35ac7e1d2753e8bc0c"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.232771 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" event={"ID":"9f43c1a5-4813-44e6-b2b9-53b134283a59","Type":"ContainerStarted","Data":"0f64c88a8abfe612e3873b6a30dceb91c40d90a0232a5d7facea7e1ef53e5a40"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.234164 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" event={"ID":"038c87fd-172d-4e34-8927-92bafb47879a","Type":"ContainerStarted","Data":"fa68abc0f36928495633f1a9a3282fe0d822a83b5ce92d8697ab42be7297a54f"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.236578 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7vmx8" 
event={"ID":"575767dd-6121-4745-aae9-c5434aee72d5","Type":"ContainerStarted","Data":"94fb868e1b5349877f8c886c9791517d9e47085d99e98534df4f3a99ec1d0e90"} Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.237162 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-7vmx8" Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.238528 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.238576 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.240873 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.252722 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.253230 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 09:07:46.753208266 +0000 UTC m=+144.296463912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.323690 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-9zhhp"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.328950 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-fqhrm"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.354302 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.355295 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.855281067 +0000 UTC m=+144.398536703 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: W0217 09:07:46.381425 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ebf9d1e_e313_440d_992a_9e0ede5b2b24.slice/crio-4eed20d8b3187dfad84a1a1c3ab76f9f782c140ec17b107cb0653cb7c8d95260 WatchSource:0}: Error finding container 4eed20d8b3187dfad84a1a1c3ab76f9f782c140ec17b107cb0653cb7c8d95260: Status 404 returned error can't find the container with id 4eed20d8b3187dfad84a1a1c3ab76f9f782c140ec17b107cb0653cb7c8d95260 Feb 17 09:07:46 crc kubenswrapper[4848]: W0217 09:07:46.382980 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71638eb4_fb1c_42be_84d6_d900ad27f196.slice/crio-c895723379992993c496b317cb8874642651986a8ef5d2a2909502f4e6324c37 WatchSource:0}: Error finding container c895723379992993c496b317cb8874642651986a8ef5d2a2909502f4e6324c37: Status 404 returned error can't find the container with id c895723379992993c496b317cb8874642651986a8ef5d2a2909502f4e6324c37 Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.430129 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" podStartSLOduration=119.430114414 podStartE2EDuration="1m59.430114414s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 09:07:46.428280363 +0000 UTC m=+143.971536009" watchObservedRunningTime="2026-02-17 09:07:46.430114414 +0000 UTC m=+143.973370060" Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.456308 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.456691 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:46.95666776 +0000 UTC m=+144.499923406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.558035 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.558410 4848 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:47.058398281 +0000 UTC m=+144.601653927 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.600578 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.613304 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-c7jxs"] Feb 17 09:07:46 crc kubenswrapper[4848]: W0217 09:07:46.613843 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7771668e_6442_46eb_a2a5_53fd35396ef4.slice/crio-dcfc77cabe5b9067acdc351c884c2154fc8373a788199f8ff4ae5d9e73d1c3a7 WatchSource:0}: Error finding container dcfc77cabe5b9067acdc351c884c2154fc8373a788199f8ff4ae5d9e73d1c3a7: Status 404 returned error can't find the container with id dcfc77cabe5b9067acdc351c884c2154fc8373a788199f8ff4ae5d9e73d1c3a7 Feb 17 09:07:46 crc kubenswrapper[4848]: W0217 09:07:46.635198 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73e08276_9d2b_4ff7_a484_f19bfc28a9ec.slice/crio-64cac3f076f1f969d8377033bfb1b2919d7372a05aa2826022394764962c3961 WatchSource:0}: Error finding container 
64cac3f076f1f969d8377033bfb1b2919d7372a05aa2826022394764962c3961: Status 404 returned error can't find the container with id 64cac3f076f1f969d8377033bfb1b2919d7372a05aa2826022394764962c3961 Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.658832 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.659102 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:47.159086425 +0000 UTC m=+144.702342071 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.662571 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t8l8g"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.666020 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.677593 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-48d4r"] Feb 17 09:07:46 
crc kubenswrapper[4848]: I0217 09:07:46.680372 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-b8gqs"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.681342 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:46 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:46 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:46 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.681400 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.683037 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.684543 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.688153 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.691263 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.696826 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.698306 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.760172 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.760578 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:47.26056616 +0000 UTC m=+144.803821806 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.796918 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-6rbxt"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.799181 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.804911 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-q62cm"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.811431 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq"] Feb 17 09:07:46 crc kubenswrapper[4848]: W0217 09:07:46.819425 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e53ded1_38a3_4129_bdfb_e0ef6ca1b748.slice/crio-f49d7aa573b0c50841749c475e98864f79111fedd37504c610ad15f3015d78b4 WatchSource:0}: Error finding container f49d7aa573b0c50841749c475e98864f79111fedd37504c610ad15f3015d78b4: Status 404 returned error can't find the container with id f49d7aa573b0c50841749c475e98864f79111fedd37504c610ad15f3015d78b4 Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.829872 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6xbj4"] Feb 17 09:07:46 crc kubenswrapper[4848]: W0217 
09:07:46.843136 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5ece5ca_10cc_4597_b14e_60eb9091c7ff.slice/crio-e59b11847b77b8b5224e09ba00e73b8faddcfd9867518d3d16b3385025da01d6 WatchSource:0}: Error finding container e59b11847b77b8b5224e09ba00e73b8faddcfd9867518d3d16b3385025da01d6: Status 404 returned error can't find the container with id e59b11847b77b8b5224e09ba00e73b8faddcfd9867518d3d16b3385025da01d6 Feb 17 09:07:46 crc kubenswrapper[4848]: W0217 09:07:46.851022 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0fba000_75c8_49be_945e_fc41fabf370c.slice/crio-6841ae4e6017a1b7f9d48345986b89fd5be59f5ce2811b73e8235225e1b3c13c WatchSource:0}: Error finding container 6841ae4e6017a1b7f9d48345986b89fd5be59f5ce2811b73e8235225e1b3c13c: Status 404 returned error can't find the container with id 6841ae4e6017a1b7f9d48345986b89fd5be59f5ce2811b73e8235225e1b3c13c Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.862224 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.862789 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:47.362752514 +0000 UTC m=+144.906008160 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.969560 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mq695"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.971570 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.976181 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:46 crc kubenswrapper[4848]: E0217 09:07:46.977081 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:47.47706346 +0000 UTC m=+145.020319106 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.977520 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-xc9l6"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.979734 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-7vmx8" podStartSLOduration=119.979715162 podStartE2EDuration="1m59.979715162s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:46.947370508 +0000 UTC m=+144.490626144" watchObservedRunningTime="2026-02-17 09:07:46.979715162 +0000 UTC m=+144.522970808" Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.985192 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-86pgj"] Feb 17 09:07:46 crc kubenswrapper[4848]: I0217 09:07:46.989603 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l"] Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.004117 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz"] Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.006364 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" 
podStartSLOduration=120.00634647 podStartE2EDuration="2m0.00634647s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:46.986335343 +0000 UTC m=+144.529591019" watchObservedRunningTime="2026-02-17 09:07:47.00634647 +0000 UTC m=+144.549602116" Feb 17 09:07:47 crc kubenswrapper[4848]: W0217 09:07:47.025831 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93141f54_9595_4160_828c_f09251e540b7.slice/crio-d792dfb65e1cdbebfad60469891de258ec408cc959ec6939de339e88381bbbd9 WatchSource:0}: Error finding container d792dfb65e1cdbebfad60469891de258ec408cc959ec6939de339e88381bbbd9: Status 404 returned error can't find the container with id d792dfb65e1cdbebfad60469891de258ec408cc959ec6939de339e88381bbbd9 Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.036833 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-2b8xg" podStartSLOduration=120.036818724 podStartE2EDuration="2m0.036818724s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.024084795 +0000 UTC m=+144.567340441" watchObservedRunningTime="2026-02-17 09:07:47.036818724 +0000 UTC m=+144.580074370" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.037288 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7gvvt"] Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.078863 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:47 crc kubenswrapper[4848]: E0217 09:07:47.079187 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:47.579169962 +0000 UTC m=+145.122425608 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.108536 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr"] Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.128348 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-jsjk5" podStartSLOduration=5.128329296 podStartE2EDuration="5.128329296s" podCreationTimestamp="2026-02-17 09:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.125109658 +0000 UTC m=+144.668365304" watchObservedRunningTime="2026-02-17 09:07:47.128329296 +0000 UTC m=+144.671584942" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.160884 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-blj8p"] Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.190774 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:47 crc kubenswrapper[4848]: E0217 09:07:47.191133 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:47.691120063 +0000 UTC m=+145.234375709 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:47 crc kubenswrapper[4848]: W0217 09:07:47.191485 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06e92d67_efb3_4cc4_b304_9543942bc23c.slice/crio-ab4d569798a4ca4e5795749b0fb63db1b5a8d65ba8c9d7410c2a84ce5d12bd81 WatchSource:0}: Error finding container ab4d569798a4ca4e5795749b0fb63db1b5a8d65ba8c9d7410c2a84ce5d12bd81: Status 404 returned error can't find the container with id ab4d569798a4ca4e5795749b0fb63db1b5a8d65ba8c9d7410c2a84ce5d12bd81 Feb 17 09:07:47 crc kubenswrapper[4848]: W0217 09:07:47.230802 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a5ecbfe_7c94_4d35_a911_9c82ba255d04.slice/crio-368a3ce52ce66552ec14e65e959a0fcf67238cb9ead3edcf2c9d3e71663fe13a WatchSource:0}: Error finding container 368a3ce52ce66552ec14e65e959a0fcf67238cb9ead3edcf2c9d3e71663fe13a: Status 404 returned error can't find the container with id 368a3ce52ce66552ec14e65e959a0fcf67238cb9ead3edcf2c9d3e71663fe13a Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.251069 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" event={"ID":"834efefd-4b1f-45e3-9085-6c0dab5f4870","Type":"ContainerStarted","Data":"df7896bb5fe3cd7b46cf17fb788f0a4ff1270e72589e69ab1db42c9165620017"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.254060 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" event={"ID":"038c87fd-172d-4e34-8927-92bafb47879a","Type":"ContainerStarted","Data":"7bff416eff5b6615a8c78608d3958e99e90741fc658ff5b1ebbc873f26cc2db4"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.257155 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" event={"ID":"b26097da-27bf-41ed-9010-8a967d0bc173","Type":"ContainerStarted","Data":"cf7705c27bf9972a6c01a1c45b533f6b27cee46599f68193679045124df98091"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.259395 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" event={"ID":"4134a970-5107-41d2-8fbe-336387a17b77","Type":"ContainerStarted","Data":"f978e191705b2547b26de06540e7242248d1380e66f3533e63d75078dbcc5c7d"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.259430 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" 
event={"ID":"4134a970-5107-41d2-8fbe-336387a17b77","Type":"ContainerStarted","Data":"c7e93bb6ba7cf1b5b688c9f13f81d3727b1ef9ac923b9acdf42832f88d1994b1"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.263142 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" event={"ID":"a32bd13d-f885-4653-8868-89fc4a8ac111","Type":"ContainerStarted","Data":"d5f302b74e8ba509361ac5298c13d5936d9146e647f368be82294281081ad033"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.263258 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" event={"ID":"a32bd13d-f885-4653-8868-89fc4a8ac111","Type":"ContainerStarted","Data":"5fd3e20886abd56c29db6d5d7e1e8921bad2021a0fd15a857e3dad74051856f3"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.266965 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" event={"ID":"7e326ca3-7185-4e3b-a023-6ccbf1214457","Type":"ContainerStarted","Data":"33bed65593cd02de82f0f8ac66728068c09bb55f3dc5dadd9443d0510834c9ac"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.269376 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" event={"ID":"f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95","Type":"ContainerStarted","Data":"8442fcca32b0ac77ee216cad7a7b53e77779349be11a397b0fe61b575dcf3dc9"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.269470 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" event={"ID":"f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95","Type":"ContainerStarted","Data":"1edefda8d69f7ff035dcd7b6c99dd3f31a390aedba1492621c6f748412b1d627"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.274727 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" event={"ID":"8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4","Type":"ContainerStarted","Data":"b27b8e94a33a624c41912e9b5ae2716c41c2236e87200485c3a7a144ecb2706d"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.274993 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-7b6g5" podStartSLOduration=120.274981186 podStartE2EDuration="2m0.274981186s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.27331542 +0000 UTC m=+144.816571076" watchObservedRunningTime="2026-02-17 09:07:47.274981186 +0000 UTC m=+144.818236832" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.277087 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-48d4r" event={"ID":"4baeac0b-e75b-49b5-9206-4cd99a4764f6","Type":"ContainerStarted","Data":"a815e17eb7bc340b3d3011ec9e5598eaf126f7bb25019a7e8845fc9cf63d5fd6"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.277128 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-48d4r" event={"ID":"4baeac0b-e75b-49b5-9206-4cd99a4764f6","Type":"ContainerStarted","Data":"b40727928149e055f92ca7eb0a4ca86ae72ad976f3e606f916629a25ac4529eb"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.277979 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.281861 4848 patch_prober.go:28] interesting pod/console-operator-58897d9998-48d4r container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get 
\"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.281900 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-48d4r" podUID="4baeac0b-e75b-49b5-9206-4cd99a4764f6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.291946 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:47 crc kubenswrapper[4848]: E0217 09:07:47.292846 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:47.792831414 +0000 UTC m=+145.336087050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.309632 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" event={"ID":"7771668e-6442-46eb-a2a5-53fd35396ef4","Type":"ContainerStarted","Data":"006eba46e66cca7aee8a98d5f17fc18d4c2ab8916fc4ec880bc24eb2acf2991b"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.309667 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" event={"ID":"7771668e-6442-46eb-a2a5-53fd35396ef4","Type":"ContainerStarted","Data":"dcfc77cabe5b9067acdc351c884c2154fc8373a788199f8ff4ae5d9e73d1c3a7"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.312729 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-4cw7d" podStartSLOduration=120.312715878 podStartE2EDuration="2m0.312715878s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.299128336 +0000 UTC m=+144.842383972" watchObservedRunningTime="2026-02-17 09:07:47.312715878 +0000 UTC m=+144.855971524" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.326116 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" 
event={"ID":"a3d51261-dd68-44ef-9329-fdb8e325d504","Type":"ContainerStarted","Data":"66e50b0b859eb90b760f1ec3aa0ce29b885fd83e737426d0e7c59e0f18f3ae7f"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.328256 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" event={"ID":"71638eb4-fb1c-42be-84d6-d900ad27f196","Type":"ContainerStarted","Data":"3da0cec87e0b8c06c3e3c39aa24354ba2bd7d119ed8b64293daceaba9d75163d"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.328330 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" event={"ID":"71638eb4-fb1c-42be-84d6-d900ad27f196","Type":"ContainerStarted","Data":"c895723379992993c496b317cb8874642651986a8ef5d2a2909502f4e6324c37"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.330157 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-bmjgh" podStartSLOduration=120.330142254 podStartE2EDuration="2m0.330142254s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.329326412 +0000 UTC m=+144.872582068" watchObservedRunningTime="2026-02-17 09:07:47.330142254 +0000 UTC m=+144.873397900" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.331798 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-ph5wk" podStartSLOduration=120.331790719 podStartE2EDuration="2m0.331790719s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.314702432 +0000 UTC m=+144.857958078" 
watchObservedRunningTime="2026-02-17 09:07:47.331790719 +0000 UTC m=+144.875046365" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.345794 4848 generic.go:334] "Generic (PLEG): container finished" podID="c546b6c9-bd0c-4a50-9521-49af89f3ede1" containerID="e589fa804713418083200aae2d66cc6d0dae7b412b9a3f99d7c223229b1ed036" exitCode=0 Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.345843 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" event={"ID":"c546b6c9-bd0c-4a50-9521-49af89f3ede1","Type":"ContainerDied","Data":"e589fa804713418083200aae2d66cc6d0dae7b412b9a3f99d7c223229b1ed036"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.349704 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-48d4r" podStartSLOduration=120.349690479 podStartE2EDuration="2m0.349690479s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.348878597 +0000 UTC m=+144.892134253" watchObservedRunningTime="2026-02-17 09:07:47.349690479 +0000 UTC m=+144.892946125" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.354617 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9h2hf" event={"ID":"a9b13597-8879-40d4-965b-580222915295","Type":"ContainerStarted","Data":"8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.357645 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" event={"ID":"79fb293c-379a-47ac-a9b1-23746cb0758e","Type":"ContainerStarted","Data":"515f138036becb409d2152ca0125d9e0eda6a3ba4b679298edae7a79f7cf642c"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.358622 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" event={"ID":"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82","Type":"ContainerStarted","Data":"4fb7e96b1f9e15266bb400523d4ea80d810e74cdba161d5bf4a8bc80188663e4"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.359274 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" event={"ID":"b1482be7-a50a-43ca-974a-d49cad628e46","Type":"ContainerStarted","Data":"6714dc743a06020ea3774527fd12944e49e13f8fd6a30ba4738735a9346d5707"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.362652 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-blj8p" event={"ID":"0a5ecbfe-7c94-4d35-a911-9c82ba255d04","Type":"ContainerStarted","Data":"368a3ce52ce66552ec14e65e959a0fcf67238cb9ead3edcf2c9d3e71663fe13a"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.364941 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" event={"ID":"e45c378d-04fe-48ff-87c9-6fcc02ede1b9","Type":"ContainerStarted","Data":"bee9f9622fb261147b9e384aeffcec25abe45fb652fe815b36d0d7e9f1da1653"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.366196 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" event={"ID":"dd116ca8-9aec-4817-a7b2-a242e01a0a2e","Type":"ContainerStarted","Data":"4aeb3481a7bd95a02e08ab932b00e9cc169590df582baed015daab17885a47f2"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.367396 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7gvvt" event={"ID":"06e92d67-efb3-4cc4-b304-9543942bc23c","Type":"ContainerStarted","Data":"ab4d569798a4ca4e5795749b0fb63db1b5a8d65ba8c9d7410c2a84ce5d12bd81"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.370348 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" event={"ID":"d5ece5ca-10cc-4597-b14e-60eb9091c7ff","Type":"ContainerStarted","Data":"e59b11847b77b8b5224e09ba00e73b8faddcfd9867518d3d16b3385025da01d6"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.372509 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" event={"ID":"3afb3ce4-f468-4042-b4d5-61285893e7e1","Type":"ContainerStarted","Data":"73f058631bb87c1327ec8f52182a3b757adfb0948790bcec22ef28865325de58"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.374498 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" event={"ID":"f4d0b3bb-c027-4390-92ac-66aad8bf0d19","Type":"ContainerStarted","Data":"e1825d86b0c3d5d7969f30c17dccb7437e7ab7ad20a72a76cc052bc896cef086"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.374543 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" event={"ID":"f4d0b3bb-c027-4390-92ac-66aad8bf0d19","Type":"ContainerStarted","Data":"aaefafae2f3e68a45cf2101536992717b93650276ff26294ef3d5ec6c023b45f"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.376773 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t" event={"ID":"15500d79-d7c7-4a0c-965d-4783f4a85b2c","Type":"ContainerStarted","Data":"e4ae5d697a2b96a9443d8ef513b6a6799c5a1185e51287032fe4f998939b8967"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.384655 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-9zhhp" podStartSLOduration=120.384640545 podStartE2EDuration="2m0.384640545s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.382671031 +0000 UTC m=+144.925926677" watchObservedRunningTime="2026-02-17 09:07:47.384640545 +0000 UTC m=+144.927896191" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.393214 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:47 crc kubenswrapper[4848]: E0217 09:07:47.395339 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:47.895322797 +0000 UTC m=+145.438578553 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.414314 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" event={"ID":"981371a6-bce6-4fc3-a2ea-2f4c0e26072c","Type":"ContainerStarted","Data":"4f739f3caba7efe20663bd048f45dae30e5e3857352293b9f2c6b25fefab45e8"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.419211 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" event={"ID":"9f43c1a5-4813-44e6-b2b9-53b134283a59","Type":"ContainerStarted","Data":"66634a377e33c57f4249a2e5c15eec13dd055a5f5e9aa5a22ea55854f85042e9"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.422110 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" event={"ID":"6e53ded1-38a3-4129-bdfb-e0ef6ca1b748","Type":"ContainerStarted","Data":"f49d7aa573b0c50841749c475e98864f79111fedd37504c610ad15f3015d78b4"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.426257 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" event={"ID":"1ebf9d1e-e313-440d-992a-9e0ede5b2b24","Type":"ContainerStarted","Data":"6b23fafff6c9ff3f23490d1a299eb49e846c2f33bc9970d527195943df416d8f"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.426299 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" event={"ID":"1ebf9d1e-e313-440d-992a-9e0ede5b2b24","Type":"ContainerStarted","Data":"f186950da6744619c1e9ed3db0a14913359ba609ad97831d51d9fd4d8b6686ec"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.426308 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" event={"ID":"1ebf9d1e-e313-440d-992a-9e0ede5b2b24","Type":"ContainerStarted","Data":"4eed20d8b3187dfad84a1a1c3ab76f9f782c140ec17b107cb0653cb7c8d95260"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.426344 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" podStartSLOduration=120.426327744 podStartE2EDuration="2m0.426327744s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.423467156 +0000 UTC m=+144.966722802" watchObservedRunningTime="2026-02-17 09:07:47.426327744 +0000 UTC m=+144.969583390" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.430070 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" event={"ID":"73e08276-9d2b-4ff7-a484-f19bfc28a9ec","Type":"ContainerStarted","Data":"922d63a54ba39d17197fecb083fafd104a414547c5b38b3014686e5dfa6c2abb"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.430115 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" event={"ID":"73e08276-9d2b-4ff7-a484-f19bfc28a9ec","Type":"ContainerStarted","Data":"64cac3f076f1f969d8377033bfb1b2919d7372a05aa2826022394764962c3961"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.431578 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" event={"ID":"b0fba000-75c8-49be-945e-fc41fabf370c","Type":"ContainerStarted","Data":"6841ae4e6017a1b7f9d48345986b89fd5be59f5ce2811b73e8235225e1b3c13c"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.441393 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" event={"ID":"1f199330-d302-4a0f-8b32-8a334c243125","Type":"ContainerStarted","Data":"3946b9bb97ca8906b2cf5b566b40c8aeb5de3a6f90132816c8a9411a52fd37ce"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.448810 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" event={"ID":"87655950-5426-4cb0-a10b-f71b0fbb0549","Type":"ContainerStarted","Data":"0529c37839444899c604d932174227bd4b1975b8bc53b44bab598afe0ecb8b11"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.448854 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" event={"ID":"87655950-5426-4cb0-a10b-f71b0fbb0549","Type":"ContainerStarted","Data":"bf6dcf38fde5495289b5d3cb1ff128bbd255c5f96e5ac72bde28c1e76d4f1e28"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.449003 4848 csr.go:261] certificate signing request csr-86vhd is approved, waiting to be issued Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.450585 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" event={"ID":"93141f54-9595-4160-828c-f09251e540b7","Type":"ContainerStarted","Data":"d792dfb65e1cdbebfad60469891de258ec408cc959ec6939de339e88381bbbd9"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.454021 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" event={"ID":"7f794184-a546-44a9-ab3b-47d69b306384","Type":"ContainerStarted","Data":"d01658466c217a74423d1081d62a9bc018f448eb6a4c2bba087957964f86deac"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.454066 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" event={"ID":"7f794184-a546-44a9-ab3b-47d69b306384","Type":"ContainerStarted","Data":"6a11b364bd1cde4beb365f2bd01b21316ad816a9ef3400087b62e5c396da6544"} Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.454095 4848 csr.go:257] certificate signing request csr-86vhd is issued Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.454809 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.456621 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.456620 4848 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t68jc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.456658 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:07:47 crc 
kubenswrapper[4848]: I0217 09:07:47.456673 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" podUID="7f794184-a546-44a9-ab3b-47d69b306384" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.494329 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:47 crc kubenswrapper[4848]: E0217 09:07:47.495206 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:47.995187007 +0000 UTC m=+145.538442653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.506441 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-9h2hf" podStartSLOduration=120.506421605 podStartE2EDuration="2m0.506421605s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.505221852 +0000 UTC m=+145.048477498" watchObservedRunningTime="2026-02-17 09:07:47.506421605 +0000 UTC m=+145.049677261" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.546832 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" podStartSLOduration=120.546816969 podStartE2EDuration="2m0.546816969s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.545525794 +0000 UTC m=+145.088781450" watchObservedRunningTime="2026-02-17 09:07:47.546816969 +0000 UTC m=+145.090072615" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.595918 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:47 crc kubenswrapper[4848]: E0217 09:07:47.596354 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.096335703 +0000 UTC m=+145.639591419 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.622689 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-fqhrm" podStartSLOduration=120.622670633 podStartE2EDuration="2m0.622670633s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.584747756 +0000 UTC m=+145.128003492" watchObservedRunningTime="2026-02-17 09:07:47.622670633 +0000 UTC m=+145.165926279" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.624041 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-vggbp" podStartSLOduration=120.624033641 podStartE2EDuration="2m0.624033641s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.622360845 +0000 UTC 
m=+145.165616491" watchObservedRunningTime="2026-02-17 09:07:47.624033641 +0000 UTC m=+145.167289287" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.661113 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-tgzvl" podStartSLOduration=120.661098114 podStartE2EDuration="2m0.661098114s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.659816299 +0000 UTC m=+145.203071935" watchObservedRunningTime="2026-02-17 09:07:47.661098114 +0000 UTC m=+145.204353760" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.682906 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:47 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:47 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:47 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.682976 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.693033 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.699296 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:47 crc kubenswrapper[4848]: E0217 09:07:47.699452 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.199432802 +0000 UTC m=+145.742688448 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.699529 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:47 crc kubenswrapper[4848]: E0217 09:07:47.699876 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.199867234 +0000 UTC m=+145.743122880 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.706233 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" podStartSLOduration=120.706221118 podStartE2EDuration="2m0.706221118s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.704796219 +0000 UTC m=+145.248051885" watchObservedRunningTime="2026-02-17 09:07:47.706221118 +0000 UTC m=+145.249476764" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.751621 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-xtkxw" podStartSLOduration=120.751602709 podStartE2EDuration="2m0.751602709s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:47.75127141 +0000 UTC m=+145.294527056" watchObservedRunningTime="2026-02-17 09:07:47.751602709 +0000 UTC m=+145.294858355" Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.800050 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:47 crc kubenswrapper[4848]: E0217 09:07:47.800397 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.300383183 +0000 UTC m=+145.843638819 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:47 crc kubenswrapper[4848]: I0217 09:07:47.902012 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:47 crc kubenswrapper[4848]: E0217 09:07:47.902356 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.402341411 +0000 UTC m=+145.945597077 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.003519 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.003825 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.503810655 +0000 UTC m=+146.047066301 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.105956 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.106330 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.606316048 +0000 UTC m=+146.149571694 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.206744 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.207142 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.707113044 +0000 UTC m=+146.250368680 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.309414 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.309796 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.809780952 +0000 UTC m=+146.353036598 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.410985 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.411231 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.911200265 +0000 UTC m=+146.454455911 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.411599 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.412006 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:48.911992137 +0000 UTC m=+146.455247793 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.455693 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-17 09:02:47 +0000 UTC, rotation deadline is 2026-11-01 11:48:45.402776949 +0000 UTC Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.455731 4848 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6170h40m56.947048446s for next certificate rotation Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.470357 4848 generic.go:334] "Generic (PLEG): container finished" podID="b1482be7-a50a-43ca-974a-d49cad628e46" containerID="de573111eace6b043c46556b120ca50a5f31e8161ba207b67dd0e3a62c91e4eb" exitCode=0 Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.470523 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" event={"ID":"b1482be7-a50a-43ca-974a-d49cad628e46","Type":"ContainerDied","Data":"de573111eace6b043c46556b120ca50a5f31e8161ba207b67dd0e3a62c91e4eb"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.476448 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" event={"ID":"3afb3ce4-f468-4042-b4d5-61285893e7e1","Type":"ContainerStarted","Data":"7c5df998e064bbb570c4d9a8c2c6574ff6e0a24bf26f1944173619b295654039"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.477179 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 
17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.482863 4848 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mq695 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.482922 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" podUID="3afb3ce4-f468-4042-b4d5-61285893e7e1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.495086 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" event={"ID":"834efefd-4b1f-45e3-9085-6c0dab5f4870","Type":"ContainerStarted","Data":"903536b2426becbbea7dce7102a53e92f6e094e44de199468fbedcd552ca452e"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.513960 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t" event={"ID":"15500d79-d7c7-4a0c-965d-4783f4a85b2c","Type":"ContainerStarted","Data":"f79cb9cda2c207bbe9309dd91299ae19fcd6a6e101ada5a110d5aa546f0ef70d"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.514172 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.514328 4848 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.014312035 +0000 UTC m=+146.557567681 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.514979 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.515577 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.015556179 +0000 UTC m=+146.558811905 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.533825 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" event={"ID":"c546b6c9-bd0c-4a50-9521-49af89f3ede1","Type":"ContainerStarted","Data":"fcf3c4387083ea25272937ebf55188670dcb92245496456f7a9876873bc65df9"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.534823 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.539417 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" event={"ID":"6e53ded1-38a3-4129-bdfb-e0ef6ca1b748","Type":"ContainerStarted","Data":"97471e2a7d4ccebcb9146bcb0e203503cddd14763829e98d07e1cb09436735bf"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.541062 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" podStartSLOduration=121.541034775 podStartE2EDuration="2m1.541034775s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.526386145 +0000 UTC m=+146.069641801" watchObservedRunningTime="2026-02-17 09:07:48.541034775 +0000 UTC m=+146.084290421" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 
09:07:48.546105 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" event={"ID":"dd116ca8-9aec-4817-a7b2-a242e01a0a2e","Type":"ContainerStarted","Data":"96f9ef507cbca9767745600d721b769237143ae12f06cdc68ac6cee2e41d7fbd"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.561976 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-t8l8g" podStartSLOduration=121.561955767 podStartE2EDuration="2m1.561955767s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.544641284 +0000 UTC m=+146.087896940" watchObservedRunningTime="2026-02-17 09:07:48.561955767 +0000 UTC m=+146.105211413" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.586945 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" event={"ID":"93141f54-9595-4160-828c-f09251e540b7","Type":"ContainerStarted","Data":"65d1e99431111b3ab55ab3c69702230aacba67a7a220cc4803cef448a40aff0e"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.589797 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" event={"ID":"73e08276-9d2b-4ff7-a484-f19bfc28a9ec","Type":"ContainerStarted","Data":"b72acb77936fd4a98479dc177871cea923c4d1ace51565a7a180b60b3e5b3226"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.592279 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" event={"ID":"b0fba000-75c8-49be-945e-fc41fabf370c","Type":"ContainerStarted","Data":"d269676e69fa6a3838213df4271ed33dd4a8ee51bb86929f8bee3c85566e988a"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.592608 4848 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.594225 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" event={"ID":"8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4","Type":"ContainerStarted","Data":"98c532eac971e835c4157db1b25d58fff83099c5ebe06b951e7d1a478739fa51"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.595920 4848 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6xbj4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.595955 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" podUID="b0fba000-75c8-49be-945e-fc41fabf370c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.596547 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" event={"ID":"b26097da-27bf-41ed-9010-8a967d0bc173","Type":"ContainerStarted","Data":"42dd3c440c0f1d7277edbe3e7cec0913f3986b6cf514dc80e51fd41c662becfc"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.601210 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" event={"ID":"e45c378d-04fe-48ff-87c9-6fcc02ede1b9","Type":"ContainerStarted","Data":"2d750374b277211ffd071d34315cc8786a81c20b4405069f68b51ded7cee7743"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.602139 4848 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.603942 4848 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-lf2lr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body= Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.604070 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" podUID="e45c378d-04fe-48ff-87c9-6fcc02ede1b9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.605645 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7gvvt" event={"ID":"06e92d67-efb3-4cc4-b304-9543942bc23c","Type":"ContainerStarted","Data":"575a8509fc80fe9d678b5b270449a4ebadd6eed82f234200b87699fb3da2dbce"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.613351 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" podStartSLOduration=121.613336483 podStartE2EDuration="2m1.613336483s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.580839574 +0000 UTC m=+146.124095230" watchObservedRunningTime="2026-02-17 09:07:48.613336483 +0000 UTC m=+146.156592129" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.613966 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-c7jxs" podStartSLOduration=121.61395962 
podStartE2EDuration="2m1.61395962s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.606773333 +0000 UTC m=+146.150028989" watchObservedRunningTime="2026-02-17 09:07:48.61395962 +0000 UTC m=+146.157215266" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.617553 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.618777 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.1187435 +0000 UTC m=+146.661999156 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.626651 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" event={"ID":"1f199330-d302-4a0f-8b32-8a334c243125","Type":"ContainerStarted","Data":"3b24f119cca9a3b36fc02e4f8b016a7bffbcc6ca70458f08b1f9482d14e9801b"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.626698 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.630370 4848 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dx9bs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.630413 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" podUID="1f199330-d302-4a0f-8b32-8a334c243125" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.643202 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" 
event={"ID":"a3d51261-dd68-44ef-9329-fdb8e325d504","Type":"ContainerStarted","Data":"261ad5f2b23ee35a3162d445f2a06a2316cc66e18471921e6d270f1c56bba406"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.648262 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" event={"ID":"7771668e-6442-46eb-a2a5-53fd35396ef4","Type":"ContainerStarted","Data":"b02abb26d3a228f98653540fcc0649d306766402751e7e55bae52cd794506eac"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.653821 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" podStartSLOduration=121.653801479 podStartE2EDuration="2m1.653801479s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.650279983 +0000 UTC m=+146.193535629" watchObservedRunningTime="2026-02-17 09:07:48.653801479 +0000 UTC m=+146.197057125" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.656982 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" event={"ID":"d5ece5ca-10cc-4597-b14e-60eb9091c7ff","Type":"ContainerStarted","Data":"e37a0fe183672db758d4faf9d87229414505fba37417b1d9f3a240533252dc6d"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.663209 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" podStartSLOduration=121.663193386 podStartE2EDuration="2m1.663193386s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.630751199 +0000 UTC m=+146.174006835" watchObservedRunningTime="2026-02-17 
09:07:48.663193386 +0000 UTC m=+146.206449042" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.664238 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" event={"ID":"cd27b6bd-2fc3-4fb3-8187-d08fccb41c82","Type":"ContainerStarted","Data":"5fc43b505a55c4e60c8af4d3d529c2c8294283da780f9f3217a67e7c1a2070c2"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.665429 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-86pgj" podStartSLOduration=121.665422147 podStartE2EDuration="2m1.665422147s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.663031591 +0000 UTC m=+146.206287237" watchObservedRunningTime="2026-02-17 09:07:48.665422147 +0000 UTC m=+146.208677803" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.681807 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:48 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:48 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:48 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.681858 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.685838 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-blj8p" 
event={"ID":"0a5ecbfe-7c94-4d35-a911-9c82ba255d04","Type":"ContainerStarted","Data":"254e8dd4d94435b06383569f24b9c3a6ca47509ccc9b5ae59b8f849ef8fea44a"} Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.687191 4848 patch_prober.go:28] interesting pod/console-operator-58897d9998-48d4r container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.687230 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-48d4r" podUID="4baeac0b-e75b-49b5-9206-4cd99a4764f6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.690335 4848 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t68jc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.690374 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" podUID="7f794184-a546-44a9-ab3b-47d69b306384" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.691742 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-47hsw" podStartSLOduration=121.691724216 podStartE2EDuration="2m1.691724216s" 
podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.691037777 +0000 UTC m=+146.234293433" watchObservedRunningTime="2026-02-17 09:07:48.691724216 +0000 UTC m=+146.234979862" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.720161 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.740594 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.240580252 +0000 UTC m=+146.783835898 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.771354 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.771396 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.790447 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-6rbxt" podStartSLOduration=121.790427345 podStartE2EDuration="2m1.790427345s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.785150961 +0000 UTC m=+146.328406607" watchObservedRunningTime="2026-02-17 09:07:48.790427345 +0000 UTC m=+146.333682991" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.792261 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" 
podStartSLOduration=121.792252835 podStartE2EDuration="2m1.792252835s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.736510601 +0000 UTC m=+146.279766247" watchObservedRunningTime="2026-02-17 09:07:48.792252835 +0000 UTC m=+146.335508481" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.821478 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.822582 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.322566954 +0000 UTC m=+146.865822600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.841318 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-4cgbp" podStartSLOduration=121.841302726 podStartE2EDuration="2m1.841302726s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.819170641 +0000 UTC m=+146.362426287" watchObservedRunningTime="2026-02-17 09:07:48.841302726 +0000 UTC m=+146.384558372" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.854602 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-blj8p" podStartSLOduration=6.854587649 podStartE2EDuration="6.854587649s" podCreationTimestamp="2026-02-17 09:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.853415857 +0000 UTC m=+146.396671493" watchObservedRunningTime="2026-02-17 09:07:48.854587649 +0000 UTC m=+146.397843295" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.884246 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wlh58" podStartSLOduration=121.884228209 podStartE2EDuration="2m1.884228209s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:48.883462328 +0000 UTC m=+146.426717994" watchObservedRunningTime="2026-02-17 09:07:48.884228209 +0000 UTC m=+146.427483855" Feb 17 09:07:48 crc kubenswrapper[4848]: I0217 09:07:48.931556 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:48 crc kubenswrapper[4848]: E0217 09:07:48.931851 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.431840201 +0000 UTC m=+146.975095847 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.032220 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.032405 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.53237731 +0000 UTC m=+147.075632956 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.032539 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.032813 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.532802461 +0000 UTC m=+147.076058107 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.133411 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.133560 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.633534316 +0000 UTC m=+147.176789962 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.133743 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.134057 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.63404741 +0000 UTC m=+147.177303056 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.235399 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.235531 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.735513264 +0000 UTC m=+147.278768910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.235618 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.235954 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.735946746 +0000 UTC m=+147.279202382 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.337075 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.337267 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.837240746 +0000 UTC m=+147.380496392 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.337559 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.337879 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.837866193 +0000 UTC m=+147.381121839 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.446174 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.446786 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:49.946769241 +0000 UTC m=+147.490024887 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.510632 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.510691 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.548240 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.548677 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.048656697 +0000 UTC m=+147.591912423 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.649627 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.649832 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.149804853 +0000 UTC m=+147.693060499 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.650187 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.650532 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.150520373 +0000 UTC m=+147.693776019 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.680289 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:49 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:49 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:49 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.680328 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.691714 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" event={"ID":"93141f54-9595-4160-828c-f09251e540b7","Type":"ContainerStarted","Data":"a2d3bc9300e4d7b0ecba11f11ec4a97b3779d37d1719e9a9707cbfcc42310539"} Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.693204 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t" event={"ID":"15500d79-d7c7-4a0c-965d-4783f4a85b2c","Type":"ContainerStarted","Data":"1c5f956d9297b42a349c314598522e0ff6ff3922aaca5200343247434e883fc6"} Feb 17 09:07:49 crc 
kubenswrapper[4848]: I0217 09:07:49.695106 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" event={"ID":"8bcf7ab2-ac9d-4767-b1e1-24e89c8204a4","Type":"ContainerStarted","Data":"bfeb23817b932ba38c6cd7bf0cd53b854d4565ba06e71c45d2b9c4687fdf1ded"} Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.695159 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.696523 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" event={"ID":"6e53ded1-38a3-4129-bdfb-e0ef6ca1b748","Type":"ContainerStarted","Data":"cd75ab8aa03936055f69a93f4a3cd7305fb074b04f249b7175fed62b397befe2"} Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.698659 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" event={"ID":"b1482be7-a50a-43ca-974a-d49cad628e46","Type":"ContainerStarted","Data":"74d3a0d612bde9447a904df15e6f42be875a55cd5c5ccd0ff7c8ddd36aba74ad"} Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.698685 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" event={"ID":"b1482be7-a50a-43ca-974a-d49cad628e46","Type":"ContainerStarted","Data":"8101e36808f68923df9176b28cb1b57ede7655911c9ec18c1df8fc77ed86ea0d"} Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.700106 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" event={"ID":"dd116ca8-9aec-4817-a7b2-a242e01a0a2e","Type":"ContainerStarted","Data":"39005e843f8dbbb109699e3a8ae24380d38b1bcbfe243bb7a4aa8109a61a8d3c"} Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.701390 4848 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" event={"ID":"7e326ca3-7185-4e3b-a023-6ccbf1214457","Type":"ContainerStarted","Data":"9f916a4cc927cea104c600ae7298b3336f897cc453dde91e58c7a195fb419dfa"} Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.703290 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7gvvt" event={"ID":"06e92d67-efb3-4cc4-b304-9543942bc23c","Type":"ContainerStarted","Data":"59204c9da99421df6cd45aea8752c99af0663b51feb651a82241efcf84ab4e4e"} Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.703906 4848 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dx9bs container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.703951 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" podUID="1f199330-d302-4a0f-8b32-8a334c243125" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.704453 4848 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6xbj4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.704484 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" podUID="b0fba000-75c8-49be-945e-fc41fabf370c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: 
connect: connection refused" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.704792 4848 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-mq695 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" start-of-body= Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.704825 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" podUID="3afb3ce4-f468-4042-b4d5-61285893e7e1" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.24:8080/healthz\": dial tcp 10.217.0.24:8080: connect: connection refused" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.704983 4848 patch_prober.go:28] interesting pod/console-operator-58897d9998-48d4r container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.705027 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-48d4r" podUID="4baeac0b-e75b-49b5-9206-4cd99a4764f6" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/readyz\": dial tcp 10.217.0.27:8443: connect: connection refused" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.704983 4848 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-lf2lr container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" start-of-body= Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.705077 4848 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" podUID="e45c378d-04fe-48ff-87c9-6fcc02ede1b9" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.32:5443/healthz\": dial tcp 10.217.0.32:5443: connect: connection refused" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.705148 4848 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-t68jc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.705169 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" podUID="7f794184-a546-44a9-ab3b-47d69b306384" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.730051 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-d4p2l" podStartSLOduration=122.730031367 podStartE2EDuration="2m2.730031367s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:49.725483162 +0000 UTC m=+147.268738808" watchObservedRunningTime="2026-02-17 09:07:49.730031367 +0000 UTC m=+147.273287023" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.751669 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.751887 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.251856703 +0000 UTC m=+147.795112349 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.754551 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.755561 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.255545524 +0000 UTC m=+147.798801220 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.779714 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" podStartSLOduration=122.779697615 podStartE2EDuration="2m2.779697615s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:49.764349815 +0000 UTC m=+147.307605471" watchObservedRunningTime="2026-02-17 09:07:49.779697615 +0000 UTC m=+147.322953261" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.782204 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7gvvt" podStartSLOduration=7.782198353 podStartE2EDuration="7.782198353s" podCreationTimestamp="2026-02-17 09:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:49.778905643 +0000 UTC m=+147.322161289" watchObservedRunningTime="2026-02-17 09:07:49.782198353 +0000 UTC m=+147.325453999" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.788693 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.789047 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:49 crc 
kubenswrapper[4848]: I0217 09:07:49.791078 4848 patch_prober.go:28] interesting pod/apiserver-76f77b778f-b8gqs container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.791128 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" podUID="b1482be7-a50a-43ca-974a-d49cad628e46" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.7:8443/livez\": dial tcp 10.217.0.7:8443: connect: connection refused" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.843553 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-xdqnq" podStartSLOduration=122.8435391 podStartE2EDuration="2m2.8435391s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:49.841802073 +0000 UTC m=+147.385057729" watchObservedRunningTime="2026-02-17 09:07:49.8435391 +0000 UTC m=+147.386794746" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.844434 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-q62cm" podStartSLOduration=122.844427305 podStartE2EDuration="2m2.844427305s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:49.81205983 +0000 UTC m=+147.355315476" watchObservedRunningTime="2026-02-17 09:07:49.844427305 +0000 UTC m=+147.387682951" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.856905 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.857329 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.357309537 +0000 UTC m=+147.900565183 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.888406 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-wks4t" podStartSLOduration=122.888386387 podStartE2EDuration="2m2.888386387s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:49.860462773 +0000 UTC m=+147.403718419" watchObservedRunningTime="2026-02-17 09:07:49.888386387 +0000 UTC m=+147.431642033" Feb 17 09:07:49 crc kubenswrapper[4848]: I0217 09:07:49.958702 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:49 crc kubenswrapper[4848]: E0217 09:07:49.959070 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.459058829 +0000 UTC m=+148.002314475 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.059272 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:50 crc kubenswrapper[4848]: E0217 09:07:50.059995 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.559979529 +0000 UTC m=+148.103235175 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.161334 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:50 crc kubenswrapper[4848]: E0217 09:07:50.161638 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.661626138 +0000 UTC m=+148.204881784 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.238630 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.262252 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:50 crc kubenswrapper[4848]: E0217 09:07:50.262594 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.762577419 +0000 UTC m=+148.305833065 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.282870 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" podStartSLOduration=123.282841683 podStartE2EDuration="2m3.282841683s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:49.889784425 +0000 UTC m=+147.433040071" watchObservedRunningTime="2026-02-17 09:07:50.282841683 +0000 UTC m=+147.826097329" Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.363612 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:50 crc kubenswrapper[4848]: E0217 09:07:50.363963 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.863951681 +0000 UTC m=+148.407207317 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.464357 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:50 crc kubenswrapper[4848]: E0217 09:07:50.464803 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:50.964788288 +0000 UTC m=+148.508043934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.565830 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:50 crc kubenswrapper[4848]: E0217 09:07:50.566237 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.066220662 +0000 UTC m=+148.609476298 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.667224 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:50 crc kubenswrapper[4848]: E0217 09:07:50.667600 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.167581243 +0000 UTC m=+148.710836899 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.681045 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:50 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:50 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:50 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.681100 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.709301 4848 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6xbj4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.709347 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" podUID="b0fba000-75c8-49be-945e-fc41fabf370c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 
10.217.0.23:8443: connect: connection refused" Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.709459 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7gvvt" Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.732595 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-zw6lz" Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.769708 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:50 crc kubenswrapper[4848]: E0217 09:07:50.770426 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.270409585 +0000 UTC m=+148.813665231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.871819 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:50 crc kubenswrapper[4848]: E0217 09:07:50.872133 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.372118266 +0000 UTC m=+148.915373912 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:50 crc kubenswrapper[4848]: I0217 09:07:50.973614 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:50 crc kubenswrapper[4848]: E0217 09:07:50.973981 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.473964481 +0000 UTC m=+149.017220127 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.074237 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.074651 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.574636164 +0000 UTC m=+149.117891810 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.175785 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.176076 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.676063377 +0000 UTC m=+149.219319023 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.277686 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.277887 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.277914 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.277937 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.277980 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.278841 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.778822547 +0000 UTC m=+149.322078193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.283047 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.283567 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.284012 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.290492 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.354952 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-lf2lr" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.379182 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.379510 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.87949252 +0000 UTC m=+149.422748156 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.407629 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.423865 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.436784 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.480657 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.480768 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.980729348 +0000 UTC m=+149.523984984 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.481109 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.481464 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:51.981455238 +0000 UTC m=+149.524710884 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.546726 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8rb4g"] Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.547610 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.551186 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.581919 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.582240 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:52.082225104 +0000 UTC m=+149.625480750 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.638688 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rb4g"] Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.683251 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-catalog-content\") pod \"community-operators-8rb4g\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.683310 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.683389 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-utilities\") pod \"community-operators-8rb4g\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.683425 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crzcq\" (UniqueName: \"kubernetes.io/projected/38529e93-75d3-4b08-a3dc-939fab0cbf66-kube-api-access-crzcq\") pod \"community-operators-8rb4g\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.683703 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:52.183687528 +0000 UTC m=+149.726943174 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.691968 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:51 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:51 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:51 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.692032 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:51 
crc kubenswrapper[4848]: I0217 09:07:51.730019 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4wt92"] Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.731008 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.731891 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" event={"ID":"7e326ca3-7185-4e3b-a023-6ccbf1214457","Type":"ContainerStarted","Data":"bd997e243946800b22a07b02d4e3544deaca5969287b96f520c8aaf3ecdec554"} Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.734358 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.758298 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4wt92"] Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.784432 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.784624 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-utilities\") pod \"community-operators-8rb4g\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.784673 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-catalog-content\") pod \"certified-operators-4wt92\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.784734 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crzcq\" (UniqueName: \"kubernetes.io/projected/38529e93-75d3-4b08-a3dc-939fab0cbf66-kube-api-access-crzcq\") pod \"community-operators-8rb4g\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.784804 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-utilities\") pod \"certified-operators-4wt92\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.784822 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87hq6\" (UniqueName: \"kubernetes.io/projected/09ac4713-0dcb-4908-8063-b6e029c132d7-kube-api-access-87hq6\") pod \"certified-operators-4wt92\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.784881 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-catalog-content\") pod \"community-operators-8rb4g\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.785419 4848 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:52.285403139 +0000 UTC m=+149.828658795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.787034 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-utilities\") pod \"community-operators-8rb4g\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.789159 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-catalog-content\") pod \"community-operators-8rb4g\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.854992 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crzcq\" (UniqueName: \"kubernetes.io/projected/38529e93-75d3-4b08-a3dc-939fab0cbf66-kube-api-access-crzcq\") pod \"community-operators-8rb4g\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.880101 4848 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.886087 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87hq6\" (UniqueName: \"kubernetes.io/projected/09ac4713-0dcb-4908-8063-b6e029c132d7-kube-api-access-87hq6\") pod \"certified-operators-4wt92\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.886151 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.886192 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-catalog-content\") pod \"certified-operators-4wt92\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.886222 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-utilities\") pod \"certified-operators-4wt92\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.886558 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-utilities\") pod 
\"certified-operators-4wt92\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.887243 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:52.387232044 +0000 UTC m=+149.930487690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.887572 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-catalog-content\") pod \"certified-operators-4wt92\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.927398 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-v548h"] Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.928666 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.934240 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87hq6\" (UniqueName: \"kubernetes.io/projected/09ac4713-0dcb-4908-8063-b6e029c132d7-kube-api-access-87hq6\") pod \"certified-operators-4wt92\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.987564 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.987723 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:52.487696611 +0000 UTC m=+150.030952257 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.988094 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn6pg\" (UniqueName: \"kubernetes.io/projected/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-kube-api-access-kn6pg\") pod \"community-operators-v548h\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.988130 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-catalog-content\") pod \"community-operators-v548h\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.988193 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-utilities\") pod \"community-operators-v548h\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.988255 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:51 crc kubenswrapper[4848]: E0217 09:07:51.988612 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:52.488596056 +0000 UTC m=+150.031851702 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:51 crc kubenswrapper[4848]: I0217 09:07:51.995730 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v548h"] Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.068583 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.100487 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.100829 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-utilities\") pod \"community-operators-v548h\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:52 crc kubenswrapper[4848]: E0217 09:07:52.100928 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:52.600909707 +0000 UTC m=+150.144165353 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.100981 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn6pg\" (UniqueName: \"kubernetes.io/projected/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-kube-api-access-kn6pg\") pod \"community-operators-v548h\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.101003 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-catalog-content\") pod \"community-operators-v548h\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.101492 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-utilities\") pod \"community-operators-v548h\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.101559 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-catalog-content\") pod \"community-operators-v548h\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " 
pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.142090 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn6pg\" (UniqueName: \"kubernetes.io/projected/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-kube-api-access-kn6pg\") pod \"community-operators-v548h\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.137978 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h7pln"] Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.143513 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.170679 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h7pln"] Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.203457 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.203542 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-catalog-content\") pod \"certified-operators-h7pln\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.203584 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-utilities\") pod \"certified-operators-h7pln\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.203639 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk6d8\" (UniqueName: \"kubernetes.io/projected/9547ed7d-6e19-4a09-84f1-8afaae314251-kube-api-access-rk6d8\") pod \"certified-operators-h7pln\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: E0217 09:07:52.204022 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:52.704005476 +0000 UTC m=+150.247261122 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.275124 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-v548h" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.304553 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.305183 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk6d8\" (UniqueName: \"kubernetes.io/projected/9547ed7d-6e19-4a09-84f1-8afaae314251-kube-api-access-rk6d8\") pod \"certified-operators-h7pln\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.305299 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-catalog-content\") pod \"certified-operators-h7pln\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.305335 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-utilities\") pod \"certified-operators-h7pln\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: E0217 09:07:52.306064 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-17 09:07:52.806042016 +0000 UTC m=+150.349297712 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.306258 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-utilities\") pod \"certified-operators-h7pln\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.306346 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-catalog-content\") pod \"certified-operators-h7pln\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.381253 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk6d8\" (UniqueName: \"kubernetes.io/projected/9547ed7d-6e19-4a09-84f1-8afaae314251-kube-api-access-rk6d8\") pod \"certified-operators-h7pln\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.407998 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:52 crc kubenswrapper[4848]: E0217 09:07:52.408387 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-17 09:07:52.908371214 +0000 UTC m=+150.451626870 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-4hfc2" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.425382 4848 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.491048 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.510363 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:52 crc kubenswrapper[4848]: E0217 09:07:52.510820 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-17 09:07:53.010800004 +0000 UTC m=+150.554055650 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.535852 4848 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-17T09:07:52.42540654Z","Handler":null,"Name":""} Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.551142 4848 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.551191 4848 csi_plugin.go:113] kubernetes.io/csi: Register new 
plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.613248 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.635667 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8rb4g"] Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.638638 4848 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.638675 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.701786 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:52 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:52 crc kubenswrapper[4848]: 
[+]process-running ok Feb 17 09:07:52 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.701827 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.758181 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4wt92"] Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.798078 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-4hfc2\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.819553 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.830120 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9a3c6232a1ba50a90100d0713198ad0cd1d189125107a8830922d84ed3a4c9c7"} Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.841494 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"6bf04531a307d5f622b47d0f3c269d965423f1855caf0a031eb26b54e671ef3d"} Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.866803 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.873790 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-v548h"] Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.893405 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" event={"ID":"7e326ca3-7185-4e3b-a023-6ccbf1214457","Type":"ContainerStarted","Data":"3c922d398bdf17b76a603d213854e132d6e08006ea97ede1df017a8872f4903c"} Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.899428 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9f4cc207d6c86017406b0ffeef5eb70daf8eec6ad766b137e323ee1832f89b07"} Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.899468 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d2e6b6668a64fdda61093537de8ff54f17b83c86b594fec5030c3458e89714e1"} Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.900168 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:07:52 crc kubenswrapper[4848]: I0217 09:07:52.902722 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rb4g" 
event={"ID":"38529e93-75d3-4b08-a3dc-939fab0cbf66","Type":"ContainerStarted","Data":"c479c8c4c09008225dffab0f49b8e5c1b34d719df315f3fecde26faff08c1db2"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.066252 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.400937 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.403546 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h7pln"] Feb 17 09:07:53 crc kubenswrapper[4848]: W0217 09:07:53.431502 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9547ed7d_6e19_4a09_84f1_8afaae314251.slice/crio-3e951031add0e1553c2ad5ccdf955c33cb529793e8e61edfbf8808e4cf450ed5 WatchSource:0}: Error finding container 3e951031add0e1553c2ad5ccdf955c33cb529793e8e61edfbf8808e4cf450ed5: Status 404 returned error can't find the container with id 3e951031add0e1553c2ad5ccdf955c33cb529793e8e61edfbf8808e4cf450ed5 Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.538109 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hfc2"] Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.548155 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 09:07:53 crc 
kubenswrapper[4848]: I0217 09:07:53.548929 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.555091 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.555303 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.561395 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.600977 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zhzdb" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.648673 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c9d44019-ee7c-43bb-95d5-f7c75445c65b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.648853 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c9d44019-ee7c-43bb-95d5-f7c75445c65b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.685643 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:53 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:53 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:53 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.686043 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.733550 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j9rm9"] Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.734939 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.738013 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.749599 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c9d44019-ee7c-43bb-95d5-f7c75445c65b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.749701 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c9d44019-ee7c-43bb-95d5-f7c75445c65b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 09:07:53 crc 
kubenswrapper[4848]: I0217 09:07:53.750292 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9rm9"] Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.750588 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"c9d44019-ee7c-43bb-95d5-f7c75445c65b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.777980 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"c9d44019-ee7c-43bb-95d5-f7c75445c65b\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.850704 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-catalog-content\") pod \"redhat-marketplace-j9rm9\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.850779 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-utilities\") pod \"redhat-marketplace-j9rm9\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.850814 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjr7q\" (UniqueName: 
\"kubernetes.io/projected/f224f79c-f6d1-442a-be23-4fc8e7527d3a-kube-api-access-fjr7q\") pod \"redhat-marketplace-j9rm9\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.917267 4848 generic.go:334] "Generic (PLEG): container finished" podID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerID="1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43" exitCode=0 Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.917498 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wt92" event={"ID":"09ac4713-0dcb-4908-8063-b6e029c132d7","Type":"ContainerDied","Data":"1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.917575 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wt92" event={"ID":"09ac4713-0dcb-4908-8063-b6e029c132d7","Type":"ContainerStarted","Data":"33f0aaca82ba82440176ffed2342fc83689e9a9796c5a5fdaeab783480d199b1"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.921031 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.926626 4848 generic.go:334] "Generic (PLEG): container finished" podID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerID="23c7a51a08b983d84c8f2d19c1860f417a1f6583e065fe1609b0e1e2c0dd98e6" exitCode=0 Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.926701 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rb4g" event={"ID":"38529e93-75d3-4b08-a3dc-939fab0cbf66","Type":"ContainerDied","Data":"23c7a51a08b983d84c8f2d19c1860f417a1f6583e065fe1609b0e1e2c0dd98e6"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.937353 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cd495bac222a14dfd6dbfcdcd116972e525c6668adb18cd797eda67bb462a642"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.955443 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-catalog-content\") pod \"redhat-marketplace-j9rm9\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.955486 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-utilities\") pod \"redhat-marketplace-j9rm9\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.955517 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjr7q\" (UniqueName: \"kubernetes.io/projected/f224f79c-f6d1-442a-be23-4fc8e7527d3a-kube-api-access-fjr7q\") pod \"redhat-marketplace-j9rm9\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.957254 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-utilities\") pod \"redhat-marketplace-j9rm9\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.957310 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-catalog-content\") pod \"redhat-marketplace-j9rm9\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.957936 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7pln" event={"ID":"9547ed7d-6e19-4a09-84f1-8afaae314251","Type":"ContainerStarted","Data":"b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.957976 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7pln" event={"ID":"9547ed7d-6e19-4a09-84f1-8afaae314251","Type":"ContainerStarted","Data":"3e951031add0e1553c2ad5ccdf955c33cb529793e8e61edfbf8808e4cf450ed5"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.964331 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ee50e5b988b9fda564c7279f717eafdbbc595f84364b32ac9f03fefaf9306f04"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.967053 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.974480 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" event={"ID":"7e326ca3-7185-4e3b-a023-6ccbf1214457","Type":"ContainerStarted","Data":"d4f9767afa1d58ffcaeb52f7781f3ecf0e072a12fa28eec7aedb0b334ed296b8"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.979560 4848 generic.go:334] "Generic (PLEG): container finished" podID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" containerID="01ac18f970beedb1abd2bdaa5e33862ad28000bfd893a0f871c460a5e3251117" exitCode=0 Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.979632 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v548h" event={"ID":"3f9aa20a-a818-4f1e-a1e0-345ea27c1832","Type":"ContainerDied","Data":"01ac18f970beedb1abd2bdaa5e33862ad28000bfd893a0f871c460a5e3251117"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.979652 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v548h" event={"ID":"3f9aa20a-a818-4f1e-a1e0-345ea27c1832","Type":"ContainerStarted","Data":"2a998c8b5d6418b84de119da9688188c68b2612b11c783469aee37ac845aaec6"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.982672 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" event={"ID":"e29aabb7-fe3b-4887-a00d-535144b46d4b","Type":"ContainerStarted","Data":"10e8c25c7a42a2caab60e4ea48c938ead49a84cd212a67475f093407457d936b"} Feb 17 09:07:53 crc kubenswrapper[4848]: I0217 09:07:53.989622 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjr7q\" (UniqueName: \"kubernetes.io/projected/f224f79c-f6d1-442a-be23-4fc8e7527d3a-kube-api-access-fjr7q\") pod \"redhat-marketplace-j9rm9\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " 
pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.059214 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.089980 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-xc9l6" podStartSLOduration=12.089963085 podStartE2EDuration="12.089963085s" podCreationTimestamp="2026-02-17 09:07:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:54.034059727 +0000 UTC m=+151.577315373" watchObservedRunningTime="2026-02-17 09:07:54.089963085 +0000 UTC m=+151.633218731" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.122009 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-q8fsn"] Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.122978 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.139203 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8fsn"] Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.263540 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-utilities\") pod \"redhat-marketplace-q8fsn\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.263633 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgrc9\" (UniqueName: \"kubernetes.io/projected/7c90e73c-c31d-4c69-a555-f191b15f8cb7-kube-api-access-rgrc9\") pod \"redhat-marketplace-q8fsn\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.263681 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-catalog-content\") pod \"redhat-marketplace-q8fsn\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.365223 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgrc9\" (UniqueName: \"kubernetes.io/projected/7c90e73c-c31d-4c69-a555-f191b15f8cb7-kube-api-access-rgrc9\") pod \"redhat-marketplace-q8fsn\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.365571 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-catalog-content\") pod \"redhat-marketplace-q8fsn\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.365594 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-utilities\") pod \"redhat-marketplace-q8fsn\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.366058 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-utilities\") pod \"redhat-marketplace-q8fsn\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.366645 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-catalog-content\") pod \"redhat-marketplace-q8fsn\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.390704 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgrc9\" (UniqueName: \"kubernetes.io/projected/7c90e73c-c31d-4c69-a555-f191b15f8cb7-kube-api-access-rgrc9\") pod \"redhat-marketplace-q8fsn\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.473610 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.476435 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9rm9"] Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.535266 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.535323 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.538734 4848 patch_prober.go:28] interesting pod/console-f9d7485db-9h2hf container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.538789 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9h2hf" podUID="a9b13597-8879-40d4-965b-580222915295" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.544668 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.544701 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection 
refused" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.544672 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.544739 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.581883 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 17 09:07:54 crc kubenswrapper[4848]: W0217 09:07:54.589660 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc9d44019_ee7c_43bb_95d5_f7c75445c65b.slice/crio-57b0916ec5b1b118bb7814fc28477933bdccca14eba1fc271e78accdff82c0c6 WatchSource:0}: Error finding container 57b0916ec5b1b118bb7814fc28477933bdccca14eba1fc271e78accdff82c0c6: Status 404 returned error can't find the container with id 57b0916ec5b1b118bb7814fc28477933bdccca14eba1fc271e78accdff82c0c6 Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.676231 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.679085 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:54 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:54 crc 
kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:54 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.679132 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.699258 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8fsn"] Feb 17 09:07:54 crc kubenswrapper[4848]: W0217 09:07:54.711718 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c90e73c_c31d_4c69_a555_f191b15f8cb7.slice/crio-efe2a0e95c15dd0a3893079b920f9fe31f556b170abc521983c52d1526af36d3 WatchSource:0}: Error finding container efe2a0e95c15dd0a3893079b920f9fe31f556b170abc521983c52d1526af36d3: Status 404 returned error can't find the container with id efe2a0e95c15dd0a3893079b920f9fe31f556b170abc521983c52d1526af36d3 Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.728240 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-45twr"] Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.729474 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.731573 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.736622 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-45twr"] Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.814448 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.827997 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-b8gqs" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.883439 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlqg7\" (UniqueName: \"kubernetes.io/projected/b3a74de7-62f1-46c2-b518-6baa8b222b1b-kube-api-access-xlqg7\") pod \"redhat-operators-45twr\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.883872 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-utilities\") pod \"redhat-operators-45twr\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.883928 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-catalog-content\") pod \"redhat-operators-45twr\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " 
pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.964746 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-48d4r" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.969500 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-t68jc" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.980380 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.986583 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlqg7\" (UniqueName: \"kubernetes.io/projected/b3a74de7-62f1-46c2-b518-6baa8b222b1b-kube-api-access-xlqg7\") pod \"redhat-operators-45twr\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.986699 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-utilities\") pod \"redhat-operators-45twr\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.986776 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-catalog-content\") pod \"redhat-operators-45twr\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.988379 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-utilities\") pod \"redhat-operators-45twr\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:54 crc kubenswrapper[4848]: I0217 09:07:54.988628 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-catalog-content\") pod \"redhat-operators-45twr\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.007472 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c9d44019-ee7c-43bb-95d5-f7c75445c65b","Type":"ContainerStarted","Data":"57b0916ec5b1b118bb7814fc28477933bdccca14eba1fc271e78accdff82c0c6"} Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.021110 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" event={"ID":"e29aabb7-fe3b-4887-a00d-535144b46d4b","Type":"ContainerStarted","Data":"fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525"} Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.021664 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.044568 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlqg7\" (UniqueName: \"kubernetes.io/projected/b3a74de7-62f1-46c2-b518-6baa8b222b1b-kube-api-access-xlqg7\") pod \"redhat-operators-45twr\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.044683 4848 generic.go:334] "Generic (PLEG): container finished" 
podID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerID="f65d32f0f84f3013e2bbf6865a1cd0abb59d8b680ec7c5a9d9db009a7d9b42a4" exitCode=0 Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.044929 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9rm9" event={"ID":"f224f79c-f6d1-442a-be23-4fc8e7527d3a","Type":"ContainerDied","Data":"f65d32f0f84f3013e2bbf6865a1cd0abb59d8b680ec7c5a9d9db009a7d9b42a4"} Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.044987 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9rm9" event={"ID":"f224f79c-f6d1-442a-be23-4fc8e7527d3a","Type":"ContainerStarted","Data":"5bd4d5e9df75f7899de3623f31cd68265db667b82f244bcd0294608277e0407b"} Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.054928 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.073392 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8fsn" event={"ID":"7c90e73c-c31d-4c69-a555-f191b15f8cb7","Type":"ContainerStarted","Data":"efe2a0e95c15dd0a3893079b920f9fe31f556b170abc521983c52d1526af36d3"} Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.074767 4848 generic.go:334] "Generic (PLEG): container finished" podID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerID="b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa" exitCode=0 Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.074810 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7pln" event={"ID":"9547ed7d-6e19-4a09-84f1-8afaae314251","Type":"ContainerDied","Data":"b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa"} Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.092062 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.092252 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4d0b3bb-c027-4390-92ac-66aad8bf0d19" containerID="e1825d86b0c3d5d7969f30c17dccb7437e7ab7ad20a72a76cc052bc896cef086" exitCode=0 Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.093089 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" event={"ID":"f4d0b3bb-c027-4390-92ac-66aad8bf0d19","Type":"ContainerDied","Data":"e1825d86b0c3d5d7969f30c17dccb7437e7ab7ad20a72a76cc052bc896cef086"} Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.129694 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bdbtg"] Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.130750 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.138553 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" podStartSLOduration=128.138535688 podStartE2EDuration="2m8.138535688s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:07:55.13459411 +0000 UTC m=+152.677849756" watchObservedRunningTime="2026-02-17 09:07:55.138535688 +0000 UTC m=+152.681791344" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.152659 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bdbtg"] Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.290696 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-utilities\") pod \"redhat-operators-bdbtg\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.291092 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wnkj\" (UniqueName: \"kubernetes.io/projected/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-kube-api-access-7wnkj\") pod \"redhat-operators-bdbtg\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.291228 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-catalog-content\") pod \"redhat-operators-bdbtg\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.394500 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-catalog-content\") pod \"redhat-operators-bdbtg\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.394594 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-utilities\") pod \"redhat-operators-bdbtg\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.394695 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wnkj\" (UniqueName: 
\"kubernetes.io/projected/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-kube-api-access-7wnkj\") pod \"redhat-operators-bdbtg\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.395422 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-utilities\") pod \"redhat-operators-bdbtg\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.395433 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-catalog-content\") pod \"redhat-operators-bdbtg\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.421481 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wnkj\" (UniqueName: \"kubernetes.io/projected/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-kube-api-access-7wnkj\") pod \"redhat-operators-bdbtg\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.463434 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.589417 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-45twr"] Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.677804 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:55 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:55 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:55 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.678034 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.702036 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dx9bs" Feb 17 09:07:55 crc kubenswrapper[4848]: I0217 09:07:55.949651 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bdbtg"] Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.101903 4848 generic.go:334] "Generic (PLEG): container finished" podID="c9d44019-ee7c-43bb-95d5-f7c75445c65b" containerID="eab15d2321dba38648a14d6bc976a8e064d3e48a57a50f14247af7237356969a" exitCode=0 Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.101966 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"c9d44019-ee7c-43bb-95d5-f7c75445c65b","Type":"ContainerDied","Data":"eab15d2321dba38648a14d6bc976a8e064d3e48a57a50f14247af7237356969a"} Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.105854 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdbtg" event={"ID":"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a","Type":"ContainerStarted","Data":"d7f37cf00f13bfa71b6a6fb43325db25bc30ae886626a42aaa08da81ff192cb0"} Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.108560 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" containerID="090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087" exitCode=0 Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.108619 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8fsn" event={"ID":"7c90e73c-c31d-4c69-a555-f191b15f8cb7","Type":"ContainerDied","Data":"090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087"} Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.113836 4848 generic.go:334] "Generic (PLEG): container finished" podID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerID="a94d3b73f2531fafe04b2d4e95713cc1de43392f761af2a548ffa9c2ce941e6f" exitCode=0 Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.115778 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45twr" event={"ID":"b3a74de7-62f1-46c2-b518-6baa8b222b1b","Type":"ContainerDied","Data":"a94d3b73f2531fafe04b2d4e95713cc1de43392f761af2a548ffa9c2ce941e6f"} Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.116001 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45twr" event={"ID":"b3a74de7-62f1-46c2-b518-6baa8b222b1b","Type":"ContainerStarted","Data":"90b19a261bae92d72c069e469b412c718ce9bd6ff89a21e491c6a399336b8ca2"} Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 
09:07:56.457881 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.521603 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gcxwv\" (UniqueName: \"kubernetes.io/projected/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-kube-api-access-gcxwv\") pod \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.521680 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-secret-volume\") pod \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.521751 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-config-volume\") pod \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\" (UID: \"f4d0b3bb-c027-4390-92ac-66aad8bf0d19\") " Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.528429 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f4d0b3bb-c027-4390-92ac-66aad8bf0d19" (UID: "f4d0b3bb-c027-4390-92ac-66aad8bf0d19"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.530150 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-config-volume" (OuterVolumeSpecName: "config-volume") pod "f4d0b3bb-c027-4390-92ac-66aad8bf0d19" (UID: "f4d0b3bb-c027-4390-92ac-66aad8bf0d19"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.530962 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-kube-api-access-gcxwv" (OuterVolumeSpecName: "kube-api-access-gcxwv") pod "f4d0b3bb-c027-4390-92ac-66aad8bf0d19" (UID: "f4d0b3bb-c027-4390-92ac-66aad8bf0d19"). InnerVolumeSpecName "kube-api-access-gcxwv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.622653 4848 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.622692 4848 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.622704 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gcxwv\" (UniqueName: \"kubernetes.io/projected/f4d0b3bb-c027-4390-92ac-66aad8bf0d19-kube-api-access-gcxwv\") on node \"crc\" DevicePath \"\"" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.678030 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe 
failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:56 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:56 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:56 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.678109 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.778065 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 09:07:56 crc kubenswrapper[4848]: E0217 09:07:56.778373 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4d0b3bb-c027-4390-92ac-66aad8bf0d19" containerName="collect-profiles" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.778392 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4d0b3bb-c027-4390-92ac-66aad8bf0d19" containerName="collect-profiles" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.778552 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4d0b3bb-c027-4390-92ac-66aad8bf0d19" containerName="collect-profiles" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.778911 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.781210 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.781478 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.798030 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.824618 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.824659 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.927231 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.927683 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.927805 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 09:07:56 crc kubenswrapper[4848]: I0217 09:07:56.945164 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.117702 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.124029 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" event={"ID":"f4d0b3bb-c027-4390-92ac-66aad8bf0d19","Type":"ContainerDied","Data":"aaefafae2f3e68a45cf2101536992717b93650276ff26294ef3d5ec6c023b45f"} Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.124064 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m" Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.124094 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aaefafae2f3e68a45cf2101536992717b93650276ff26294ef3d5ec6c023b45f" Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.128240 4848 generic.go:334] "Generic (PLEG): container finished" podID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerID="c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693" exitCode=0 Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.128321 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdbtg" event={"ID":"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a","Type":"ContainerDied","Data":"c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693"} Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.452964 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.621661 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kubelet-dir\") pod \"c9d44019-ee7c-43bb-95d5-f7c75445c65b\" (UID: \"c9d44019-ee7c-43bb-95d5-f7c75445c65b\") " Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.621718 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kube-api-access\") pod \"c9d44019-ee7c-43bb-95d5-f7c75445c65b\" (UID: \"c9d44019-ee7c-43bb-95d5-f7c75445c65b\") " Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.625247 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c9d44019-ee7c-43bb-95d5-f7c75445c65b" (UID: "c9d44019-ee7c-43bb-95d5-f7c75445c65b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.625704 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c9d44019-ee7c-43bb-95d5-f7c75445c65b" (UID: "c9d44019-ee7c-43bb-95d5-f7c75445c65b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.654894 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.680457 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:57 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:57 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:57 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.680530 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:57 crc kubenswrapper[4848]: W0217 09:07:57.709093 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8dc8bdc8_a6b9_4e53_8dc6_45d5614fbbac.slice/crio-a69efdf62df325cdebd17fb9c9bb7d03d66fab89ce95e8a721f9755c06e274f5 WatchSource:0}: Error finding container a69efdf62df325cdebd17fb9c9bb7d03d66fab89ce95e8a721f9755c06e274f5: Status 404 returned error can't find the container with id a69efdf62df325cdebd17fb9c9bb7d03d66fab89ce95e8a721f9755c06e274f5 Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.723486 4848 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 09:07:57 crc kubenswrapper[4848]: I0217 09:07:57.723520 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/c9d44019-ee7c-43bb-95d5-f7c75445c65b-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 09:07:58 crc kubenswrapper[4848]: I0217 09:07:58.140146 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac","Type":"ContainerStarted","Data":"a69efdf62df325cdebd17fb9c9bb7d03d66fab89ce95e8a721f9755c06e274f5"} Feb 17 09:07:58 crc kubenswrapper[4848]: I0217 09:07:58.153277 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"c9d44019-ee7c-43bb-95d5-f7c75445c65b","Type":"ContainerDied","Data":"57b0916ec5b1b118bb7814fc28477933bdccca14eba1fc271e78accdff82c0c6"} Feb 17 09:07:58 crc kubenswrapper[4848]: I0217 09:07:58.153319 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57b0916ec5b1b118bb7814fc28477933bdccca14eba1fc271e78accdff82c0c6" Feb 17 09:07:58 crc kubenswrapper[4848]: I0217 09:07:58.153401 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 17 09:07:58 crc kubenswrapper[4848]: I0217 09:07:58.679589 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:58 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:58 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:58 crc kubenswrapper[4848]: healthz check failed Feb 17 09:07:58 crc kubenswrapper[4848]: I0217 09:07:58.679982 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:07:59 crc kubenswrapper[4848]: I0217 09:07:59.237907 4848 generic.go:334] "Generic (PLEG): container finished" podID="8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac" containerID="7728240f4702e788ba22b010a27189cc6c3151660e61cf60ce3907412c68ce65" exitCode=0 Feb 17 09:07:59 crc kubenswrapper[4848]: I0217 09:07:59.237995 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac","Type":"ContainerDied","Data":"7728240f4702e788ba22b010a27189cc6c3151660e61cf60ce3907412c68ce65"} Feb 17 09:07:59 crc kubenswrapper[4848]: I0217 09:07:59.693920 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:07:59 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:07:59 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:07:59 crc kubenswrapper[4848]: healthz 
check failed Feb 17 09:07:59 crc kubenswrapper[4848]: I0217 09:07:59.693996 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:08:00 crc kubenswrapper[4848]: I0217 09:08:00.682667 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:08:00 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:08:00 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:08:00 crc kubenswrapper[4848]: healthz check failed Feb 17 09:08:00 crc kubenswrapper[4848]: I0217 09:08:00.682990 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:08:00 crc kubenswrapper[4848]: I0217 09:08:00.750997 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7gvvt" Feb 17 09:08:01 crc kubenswrapper[4848]: I0217 09:08:01.678000 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:08:01 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:08:01 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:08:01 crc kubenswrapper[4848]: healthz check failed Feb 17 09:08:01 crc kubenswrapper[4848]: I0217 09:08:01.678369 4848 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:08:02 crc kubenswrapper[4848]: I0217 09:08:02.678629 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:08:02 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:08:02 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:08:02 crc kubenswrapper[4848]: healthz check failed Feb 17 09:08:02 crc kubenswrapper[4848]: I0217 09:08:02.678679 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:08:03 crc kubenswrapper[4848]: I0217 09:08:03.677605 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:08:03 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:08:03 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:08:03 crc kubenswrapper[4848]: healthz check failed Feb 17 09:08:03 crc kubenswrapper[4848]: I0217 09:08:03.677661 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:08:04 crc kubenswrapper[4848]: I0217 09:08:04.535206 4848 patch_prober.go:28] interesting pod/console-f9d7485db-9h2hf 
container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body= Feb 17 09:08:04 crc kubenswrapper[4848]: I0217 09:08:04.535493 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9h2hf" podUID="a9b13597-8879-40d4-965b-580222915295" containerName="console" probeResult="failure" output="Get \"https://10.217.0.20:8443/health\": dial tcp 10.217.0.20:8443: connect: connection refused" Feb 17 09:08:04 crc kubenswrapper[4848]: I0217 09:08:04.545640 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:08:04 crc kubenswrapper[4848]: I0217 09:08:04.545680 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:08:04 crc kubenswrapper[4848]: I0217 09:08:04.545700 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:08:04 crc kubenswrapper[4848]: I0217 09:08:04.545739 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:08:04 crc kubenswrapper[4848]: 
I0217 09:08:04.677662 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:08:04 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:08:04 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:08:04 crc kubenswrapper[4848]: healthz check failed Feb 17 09:08:04 crc kubenswrapper[4848]: I0217 09:08:04.677721 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:08:05 crc kubenswrapper[4848]: I0217 09:08:05.678382 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:08:05 crc kubenswrapper[4848]: [-]has-synced failed: reason withheld Feb 17 09:08:05 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:08:05 crc kubenswrapper[4848]: healthz check failed Feb 17 09:08:05 crc kubenswrapper[4848]: I0217 09:08:05.678459 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:08:05 crc kubenswrapper[4848]: I0217 09:08:05.918790 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 09:08:05 crc kubenswrapper[4848]: I0217 09:08:05.975740 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kube-api-access\") pod \"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac\" (UID: \"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac\") " Feb 17 09:08:05 crc kubenswrapper[4848]: I0217 09:08:05.975812 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kubelet-dir\") pod \"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac\" (UID: \"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac\") " Feb 17 09:08:05 crc kubenswrapper[4848]: I0217 09:08:05.975907 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac" (UID: "8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:08:05 crc kubenswrapper[4848]: I0217 09:08:05.976211 4848 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:05 crc kubenswrapper[4848]: I0217 09:08:05.980944 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac" (UID: "8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:06 crc kubenswrapper[4848]: I0217 09:08:06.077837 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:06 crc kubenswrapper[4848]: I0217 09:08:06.322887 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac","Type":"ContainerDied","Data":"a69efdf62df325cdebd17fb9c9bb7d03d66fab89ce95e8a721f9755c06e274f5"} Feb 17 09:08:06 crc kubenswrapper[4848]: I0217 09:08:06.322941 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a69efdf62df325cdebd17fb9c9bb7d03d66fab89ce95e8a721f9755c06e274f5" Feb 17 09:08:06 crc kubenswrapper[4848]: I0217 09:08:06.324495 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Feb 17 09:08:06 crc kubenswrapper[4848]: I0217 09:08:06.678808 4848 patch_prober.go:28] interesting pod/router-default-5444994796-gxh7z container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 17 09:08:06 crc kubenswrapper[4848]: [+]has-synced ok Feb 17 09:08:06 crc kubenswrapper[4848]: [+]process-running ok Feb 17 09:08:06 crc kubenswrapper[4848]: healthz check failed Feb 17 09:08:06 crc kubenswrapper[4848]: I0217 09:08:06.678877 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-gxh7z" podUID="414280b4-3299-4fee-a33c-a231b66000c7" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 17 09:08:07 crc kubenswrapper[4848]: I0217 09:08:07.678241 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:08:07 crc kubenswrapper[4848]: I0217 09:08:07.680752 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-gxh7z" Feb 17 09:08:10 crc kubenswrapper[4848]: I0217 09:08:10.037152 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:08:10 crc kubenswrapper[4848]: I0217 09:08:10.049410 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/98bfddd8-4a1a-4b90-973a-adb75b02fdba-metrics-certs\") pod \"network-metrics-daemon-78r6x\" (UID: \"98bfddd8-4a1a-4b90-973a-adb75b02fdba\") " pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:08:10 crc kubenswrapper[4848]: I0217 09:08:10.099124 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-78r6x" Feb 17 09:08:10 crc kubenswrapper[4848]: I0217 09:08:10.872346 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6xbj4"] Feb 17 09:08:10 crc kubenswrapper[4848]: I0217 09:08:10.872634 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" podUID="b0fba000-75c8-49be-945e-fc41fabf370c" containerName="controller-manager" containerID="cri-o://d269676e69fa6a3838213df4271ed33dd4a8ee51bb86929f8bee3c85566e988a" gracePeriod=30 Feb 17 09:08:10 crc kubenswrapper[4848]: I0217 09:08:10.899547 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg"] Feb 17 09:08:10 crc kubenswrapper[4848]: I0217 09:08:10.899902 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" podUID="2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" containerName="route-controller-manager" containerID="cri-o://3db917ae5c371beef36fa9f850d541ccaa54d93d792ac38565efdc62c7cda6e8" gracePeriod=30 Feb 17 09:08:11 crc kubenswrapper[4848]: I0217 09:08:11.351648 4848 generic.go:334] "Generic (PLEG): container finished" podID="2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" containerID="3db917ae5c371beef36fa9f850d541ccaa54d93d792ac38565efdc62c7cda6e8" exitCode=0 Feb 17 09:08:11 crc kubenswrapper[4848]: I0217 09:08:11.351743 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" event={"ID":"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a","Type":"ContainerDied","Data":"3db917ae5c371beef36fa9f850d541ccaa54d93d792ac38565efdc62c7cda6e8"} Feb 17 09:08:11 crc kubenswrapper[4848]: I0217 09:08:11.354330 4848 generic.go:334] "Generic (PLEG): container finished" 
podID="b0fba000-75c8-49be-945e-fc41fabf370c" containerID="d269676e69fa6a3838213df4271ed33dd4a8ee51bb86929f8bee3c85566e988a" exitCode=0 Feb 17 09:08:11 crc kubenswrapper[4848]: I0217 09:08:11.354383 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" event={"ID":"b0fba000-75c8-49be-945e-fc41fabf370c","Type":"ContainerDied","Data":"d269676e69fa6a3838213df4271ed33dd4a8ee51bb86929f8bee3c85566e988a"} Feb 17 09:08:12 crc kubenswrapper[4848]: I0217 09:08:12.872715 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.545057 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.545061 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.545389 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.545331 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.545476 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-7vmx8" Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.546003 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.546025 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.546142 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"94fb868e1b5349877f8c886c9791517d9e47085d99e98534df4f3a99ec1d0e90"} pod="openshift-console/downloads-7954f5f757-7vmx8" containerMessage="Container download-server failed liveness probe, will be restarted" Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.546237 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" containerID="cri-o://94fb868e1b5349877f8c886c9791517d9e47085d99e98534df4f3a99ec1d0e90" gracePeriod=2 Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.581175 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:08:14 crc kubenswrapper[4848]: I0217 09:08:14.588416 
4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:08:15 crc kubenswrapper[4848]: I0217 09:08:15.385823 4848 generic.go:334] "Generic (PLEG): container finished" podID="575767dd-6121-4745-aae9-c5434aee72d5" containerID="94fb868e1b5349877f8c886c9791517d9e47085d99e98534df4f3a99ec1d0e90" exitCode=0 Feb 17 09:08:15 crc kubenswrapper[4848]: I0217 09:08:15.389171 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7vmx8" event={"ID":"575767dd-6121-4745-aae9-c5434aee72d5","Type":"ContainerDied","Data":"94fb868e1b5349877f8c886c9791517d9e47085d99e98534df4f3a99ec1d0e90"} Feb 17 09:08:15 crc kubenswrapper[4848]: I0217 09:08:15.462434 4848 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lwfvg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Feb 17 09:08:15 crc kubenswrapper[4848]: I0217 09:08:15.462543 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" podUID="2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 09:08:15 crc kubenswrapper[4848]: I0217 09:08:15.969309 4848 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6xbj4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" 
start-of-body= Feb 17 09:08:15 crc kubenswrapper[4848]: I0217 09:08:15.969389 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" podUID="b0fba000-75c8-49be-945e-fc41fabf370c" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Feb 17 09:08:18 crc kubenswrapper[4848]: I0217 09:08:18.772215 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:08:18 crc kubenswrapper[4848]: I0217 09:08:18.772695 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.365593 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.366802 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:08:23 crc kubenswrapper[4848]: E0217 09:08:23.413729 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 17 09:08:23 crc kubenswrapper[4848]: E0217 09:08:23.413890 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rgrc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,
ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-q8fsn_openshift-marketplace(7c90e73c-c31d-4c69-a555-f191b15f8cb7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 17 09:08:23 crc kubenswrapper[4848]: E0217 09:08:23.415900 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-q8fsn" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.436451 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-client-ca\") pod \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.436539 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-proxy-ca-bundles\") pod \"b0fba000-75c8-49be-945e-fc41fabf370c\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.436568 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-serving-cert\") pod \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.436585 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-config\") pod \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.436615 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcbb8\" (UniqueName: \"kubernetes.io/projected/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-kube-api-access-vcbb8\") pod \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\" (UID: \"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a\") " Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.436632 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgbwd\" (UniqueName: \"kubernetes.io/projected/b0fba000-75c8-49be-945e-fc41fabf370c-kube-api-access-fgbwd\") pod \"b0fba000-75c8-49be-945e-fc41fabf370c\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.436665 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-client-ca\") pod \"b0fba000-75c8-49be-945e-fc41fabf370c\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.436694 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fba000-75c8-49be-945e-fc41fabf370c-serving-cert\") pod \"b0fba000-75c8-49be-945e-fc41fabf370c\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.436711 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-config\") pod \"b0fba000-75c8-49be-945e-fc41fabf370c\" (UID: \"b0fba000-75c8-49be-945e-fc41fabf370c\") " Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.438217 
4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-config" (OuterVolumeSpecName: "config") pod "2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" (UID: "2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.439214 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-844596c887-vrvtx"] Feb 17 09:08:23 crc kubenswrapper[4848]: E0217 09:08:23.439417 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" containerName="route-controller-manager" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.439429 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" containerName="route-controller-manager" Feb 17 09:08:23 crc kubenswrapper[4848]: E0217 09:08:23.439446 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9d44019-ee7c-43bb-95d5-f7c75445c65b" containerName="pruner" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.439453 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d44019-ee7c-43bb-95d5-f7c75445c65b" containerName="pruner" Feb 17 09:08:23 crc kubenswrapper[4848]: E0217 09:08:23.439465 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac" containerName="pruner" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.439472 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac" containerName="pruner" Feb 17 09:08:23 crc kubenswrapper[4848]: E0217 09:08:23.439481 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0fba000-75c8-49be-945e-fc41fabf370c" containerName="controller-manager" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.439487 4848 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b0fba000-75c8-49be-945e-fc41fabf370c" containerName="controller-manager" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.439570 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" containerName="route-controller-manager" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.439579 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9d44019-ee7c-43bb-95d5-f7c75445c65b" containerName="pruner" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.439591 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dc8bdc8-a6b9-4e53-8dc6-45d5614fbbac" containerName="pruner" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.439603 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0fba000-75c8-49be-945e-fc41fabf370c" containerName="controller-manager" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.440010 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-844596c887-vrvtx"] Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.440085 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.440153 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-client-ca" (OuterVolumeSpecName: "client-ca") pod "2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" (UID: "2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.449607 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-client-ca" (OuterVolumeSpecName: "client-ca") pod "b0fba000-75c8-49be-945e-fc41fabf370c" (UID: "b0fba000-75c8-49be-945e-fc41fabf370c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.457704 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0fba000-75c8-49be-945e-fc41fabf370c-kube-api-access-fgbwd" (OuterVolumeSpecName: "kube-api-access-fgbwd") pod "b0fba000-75c8-49be-945e-fc41fabf370c" (UID: "b0fba000-75c8-49be-945e-fc41fabf370c"). InnerVolumeSpecName "kube-api-access-fgbwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.459703 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-config" (OuterVolumeSpecName: "config") pod "b0fba000-75c8-49be-945e-fc41fabf370c" (UID: "b0fba000-75c8-49be-945e-fc41fabf370c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.461853 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b0fba000-75c8-49be-945e-fc41fabf370c" (UID: "b0fba000-75c8-49be-945e-fc41fabf370c"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.467867 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" (UID: "2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.472953 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" event={"ID":"2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a","Type":"ContainerDied","Data":"52e2f74afbe0914ea80f4fc0b8fed78194bf90a8f3fb5c6243921f6b3025c569"} Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.473003 4848 scope.go:117] "RemoveContainer" containerID="3db917ae5c371beef36fa9f850d541ccaa54d93d792ac38565efdc62c7cda6e8" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.473152 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.479017 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-kube-api-access-vcbb8" (OuterVolumeSpecName: "kube-api-access-vcbb8") pod "2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" (UID: "2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a"). InnerVolumeSpecName "kube-api-access-vcbb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.483234 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0fba000-75c8-49be-945e-fc41fabf370c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b0fba000-75c8-49be-945e-fc41fabf370c" (UID: "b0fba000-75c8-49be-945e-fc41fabf370c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.493722 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.493797 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6xbj4" event={"ID":"b0fba000-75c8-49be-945e-fc41fabf370c","Type":"ContainerDied","Data":"6841ae4e6017a1b7f9d48345986b89fd5be59f5ce2811b73e8235225e1b3c13c"} Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.511901 4848 scope.go:117] "RemoveContainer" containerID="d269676e69fa6a3838213df4271ed33dd4a8ee51bb86929f8bee3c85566e988a" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538284 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-config\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538594 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zgqc\" (UniqueName: \"kubernetes.io/projected/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-kube-api-access-7zgqc\") pod \"controller-manager-844596c887-vrvtx\" (UID: 
\"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538624 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-client-ca\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538644 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-serving-cert\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538687 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-proxy-ca-bundles\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538759 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538772 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:23 crc kubenswrapper[4848]: 
I0217 09:08:23.538793 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538802 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcbb8\" (UniqueName: \"kubernetes.io/projected/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-kube-api-access-vcbb8\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538811 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgbwd\" (UniqueName: \"kubernetes.io/projected/b0fba000-75c8-49be-945e-fc41fabf370c-kube-api-access-fgbwd\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538820 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538828 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fba000-75c8-49be-945e-fc41fabf370c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538836 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0fba000-75c8-49be-945e-fc41fabf370c-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.538844 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:23 crc kubenswrapper[4848]: E0217 09:08:23.542201 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with 
ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-q8fsn" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.556542 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6xbj4"] Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.559267 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6xbj4"] Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.615754 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-78r6x"] Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.639770 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-config\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.639814 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zgqc\" (UniqueName: \"kubernetes.io/projected/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-kube-api-access-7zgqc\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.639841 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-client-ca\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " 
pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.639867 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-serving-cert\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.639901 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-proxy-ca-bundles\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.641311 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-proxy-ca-bundles\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.641316 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-config\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.641900 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-client-ca\") pod 
\"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.646780 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-serving-cert\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.659121 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zgqc\" (UniqueName: \"kubernetes.io/projected/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-kube-api-access-7zgqc\") pod \"controller-manager-844596c887-vrvtx\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.787687 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.811931 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg"] Feb 17 09:08:23 crc kubenswrapper[4848]: I0217 09:08:23.815624 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lwfvg"] Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.108252 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-844596c887-vrvtx"] Feb 17 09:08:24 crc kubenswrapper[4848]: W0217 09:08:24.226065 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40b8048f_3ea0_48dc_95c8_4a96ffdb14a2.slice/crio-e9d643c099c3964aa5ec80060331250fbc31d578e01336797056596159a5db5b WatchSource:0}: Error finding container e9d643c099c3964aa5ec80060331250fbc31d578e01336797056596159a5db5b: Status 404 returned error can't find the container with id e9d643c099c3964aa5ec80060331250fbc31d578e01336797056596159a5db5b Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.508530 4848 generic.go:334] "Generic (PLEG): container finished" podID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerID="cf987ef05be8d4c0f9e3ff66a958ba259bcb73a5c640f68afc9026b1bdb17c17" exitCode=0 Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.508580 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rb4g" event={"ID":"38529e93-75d3-4b08-a3dc-939fab0cbf66","Type":"ContainerDied","Data":"cf987ef05be8d4c0f9e3ff66a958ba259bcb73a5c640f68afc9026b1bdb17c17"} Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.524572 4848 generic.go:334] "Generic (PLEG): container finished" podID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" 
containerID="fd6ea090bd5c207540d8068a4d1141c9e5bedc19c6432f19fa315d9f7a59483e" exitCode=0 Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.524942 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v548h" event={"ID":"3f9aa20a-a818-4f1e-a1e0-345ea27c1832","Type":"ContainerDied","Data":"fd6ea090bd5c207540d8068a4d1141c9e5bedc19c6432f19fa315d9f7a59483e"} Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.528221 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdbtg" event={"ID":"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a","Type":"ContainerStarted","Data":"8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577"} Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.534204 4848 generic.go:334] "Generic (PLEG): container finished" podID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerID="9cd61d8f24fd187fbe89d4894eccc26dcf241c909879449d0d181ef6389f5352" exitCode=0 Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.534319 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9rm9" event={"ID":"f224f79c-f6d1-442a-be23-4fc8e7527d3a","Type":"ContainerDied","Data":"9cd61d8f24fd187fbe89d4894eccc26dcf241c909879449d0d181ef6389f5352"} Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.541367 4848 generic.go:334] "Generic (PLEG): container finished" podID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerID="f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15" exitCode=0 Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.541481 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7pln" event={"ID":"9547ed7d-6e19-4a09-84f1-8afaae314251","Type":"ContainerDied","Data":"f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15"} Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.543232 4848 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" event={"ID":"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2","Type":"ContainerStarted","Data":"e9d643c099c3964aa5ec80060331250fbc31d578e01336797056596159a5db5b"} Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.545455 4848 generic.go:334] "Generic (PLEG): container finished" podID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerID="ec95d1591dc1890ddce80c4050574469fa8ba0f01b46c3320dad3b29f8da4f1f" exitCode=0 Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.545546 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45twr" event={"ID":"b3a74de7-62f1-46c2-b518-6baa8b222b1b","Type":"ContainerDied","Data":"ec95d1591dc1890ddce80c4050574469fa8ba0f01b46c3320dad3b29f8da4f1f"} Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.545664 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.545728 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.554409 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-7vmx8" event={"ID":"575767dd-6121-4745-aae9-c5434aee72d5","Type":"ContainerStarted","Data":"7710159f0a03e4ade6663c0f91f9f0b1d6b5989f382a563c0cead3bf71758e61"} Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.557339 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console/downloads-7954f5f757-7vmx8" Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.557458 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.557490 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.558848 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-78r6x" event={"ID":"98bfddd8-4a1a-4b90-973a-adb75b02fdba","Type":"ContainerStarted","Data":"37a488d3d05517401800d8c6baa83d1414d9c7ad515efaf4d20ee7413ca79154"} Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.560279 4848 generic.go:334] "Generic (PLEG): container finished" podID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerID="1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce" exitCode=0 Feb 17 09:08:24 crc kubenswrapper[4848]: I0217 09:08:24.560309 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wt92" event={"ID":"09ac4713-0dcb-4908-8063-b6e029c132d7","Type":"ContainerDied","Data":"1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce"} Feb 17 09:08:24 crc kubenswrapper[4848]: E0217 09:08:24.710450 4848 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c5c067_a917_4f03_9cd1_6c5b3c1d768a.slice/crio-conmon-8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577.scope\": RecentStats: unable to find data in memory cache]" Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.395040 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a" path="/var/lib/kubelet/pods/2bcea8cf-9a8e-42a8-ae56-b915ec6dc58a/volumes" Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.396268 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0fba000-75c8-49be-945e-fc41fabf370c" path="/var/lib/kubelet/pods/b0fba000-75c8-49be-945e-fc41fabf370c/volumes" Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.567173 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" event={"ID":"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2","Type":"ContainerStarted","Data":"8dc4bd515b6d87da583dfa1e5d670fbfd76e89678ca795c7db1220c0d49b5e6b"} Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.567636 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.569520 4848 generic.go:334] "Generic (PLEG): container finished" podID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerID="8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577" exitCode=0 Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.569565 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdbtg" event={"ID":"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a","Type":"ContainerDied","Data":"8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577"} Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.575540 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-78r6x" event={"ID":"98bfddd8-4a1a-4b90-973a-adb75b02fdba","Type":"ContainerStarted","Data":"27fb8e5c50ecde3b48f8b030d82fdfa6fc89432777b2724a227824bf60bf7400"} Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.575590 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-78r6x" event={"ID":"98bfddd8-4a1a-4b90-973a-adb75b02fdba","Type":"ContainerStarted","Data":"751834a91c9a1dd494decb25d4f0bad67fd51595d21f4ad2d4d64271bf751491"} Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.575701 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.575743 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.577370 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.593336 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" podStartSLOduration=15.593308674 podStartE2EDuration="15.593308674s" podCreationTimestamp="2026-02-17 09:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:08:25.590037144 +0000 UTC m=+183.133292880" watchObservedRunningTime="2026-02-17 
09:08:25.593308674 +0000 UTC m=+183.136564360" Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.651365 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-78r6x" podStartSLOduration=158.651343661 podStartE2EDuration="2m38.651343661s" podCreationTimestamp="2026-02-17 09:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:08:25.628176417 +0000 UTC m=+183.171432103" watchObservedRunningTime="2026-02-17 09:08:25.651343661 +0000 UTC m=+183.194599307" Feb 17 09:08:25 crc kubenswrapper[4848]: I0217 09:08:25.717590 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-djpdz" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.060438 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2"] Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.061208 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.063873 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.064329 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.064356 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.064582 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.064687 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.064726 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.077918 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2"] Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.177141 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hvsj\" (UniqueName: \"kubernetes.io/projected/71987f3f-8a3b-42b6-a6df-a90f07969caf-kube-api-access-9hvsj\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.177260 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-config\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.177354 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71987f3f-8a3b-42b6-a6df-a90f07969caf-serving-cert\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.177409 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-client-ca\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.279044 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-config\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.279200 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71987f3f-8a3b-42b6-a6df-a90f07969caf-serving-cert\") pod \"route-controller-manager-6867996bd5-5d4d2\" 
(UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.280385 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-client-ca\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.280552 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hvsj\" (UniqueName: \"kubernetes.io/projected/71987f3f-8a3b-42b6-a6df-a90f07969caf-kube-api-access-9hvsj\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.288102 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71987f3f-8a3b-42b6-a6df-a90f07969caf-serving-cert\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.302120 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-client-ca\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.304058 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-config\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.313552 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hvsj\" (UniqueName: \"kubernetes.io/projected/71987f3f-8a3b-42b6-a6df-a90f07969caf-kube-api-access-9hvsj\") pod \"route-controller-manager-6867996bd5-5d4d2\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.375591 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.584375 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9rm9" event={"ID":"f224f79c-f6d1-442a-be23-4fc8e7527d3a","Type":"ContainerStarted","Data":"016c29e3e70c33bfad969b7a77460d8ed18f621e0c1997b2e20c3e1d1517a198"} Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.585583 4848 patch_prober.go:28] interesting pod/downloads-7954f5f757-7vmx8 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Feb 17 09:08:26 crc kubenswrapper[4848]: I0217 09:08:26.585625 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-7vmx8" podUID="575767dd-6121-4745-aae9-c5434aee72d5" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Feb 17 09:08:26 crc 
kubenswrapper[4848]: I0217 09:08:26.602180 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j9rm9" podStartSLOduration=2.885401779 podStartE2EDuration="33.60215782s" podCreationTimestamp="2026-02-17 09:07:53 +0000 UTC" firstStartedPulling="2026-02-17 09:07:55.06001044 +0000 UTC m=+152.603266086" lastFinishedPulling="2026-02-17 09:08:25.776766491 +0000 UTC m=+183.320022127" observedRunningTime="2026-02-17 09:08:26.601290017 +0000 UTC m=+184.144545673" watchObservedRunningTime="2026-02-17 09:08:26.60215782 +0000 UTC m=+184.145413476" Feb 17 09:08:27 crc kubenswrapper[4848]: I0217 09:08:27.819572 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2"] Feb 17 09:08:28 crc kubenswrapper[4848]: I0217 09:08:28.601733 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" event={"ID":"71987f3f-8a3b-42b6-a6df-a90f07969caf","Type":"ContainerStarted","Data":"30de5b1dd6d80def24a2556dfcd31f987b683e6a1343078da661c937295398a1"} Feb 17 09:08:28 crc kubenswrapper[4848]: I0217 09:08:28.604987 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rb4g" event={"ID":"38529e93-75d3-4b08-a3dc-939fab0cbf66","Type":"ContainerStarted","Data":"5fba36ff660c78451338f51cde4a26f30f2655f77c1f5f7663b7388d93c29793"} Feb 17 09:08:28 crc kubenswrapper[4848]: I0217 09:08:28.621807 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8rb4g" podStartSLOduration=4.252968596 podStartE2EDuration="37.621785365s" podCreationTimestamp="2026-02-17 09:07:51 +0000 UTC" firstStartedPulling="2026-02-17 09:07:53.935854921 +0000 UTC m=+151.479110567" lastFinishedPulling="2026-02-17 09:08:27.30467167 +0000 UTC m=+184.847927336" observedRunningTime="2026-02-17 
09:08:28.62123491 +0000 UTC m=+186.164490586" watchObservedRunningTime="2026-02-17 09:08:28.621785365 +0000 UTC m=+186.165041021" Feb 17 09:08:29 crc kubenswrapper[4848]: I0217 09:08:29.613065 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" event={"ID":"71987f3f-8a3b-42b6-a6df-a90f07969caf","Type":"ContainerStarted","Data":"46b59167fcae4949e2d5af94c82ed35d73450900a79abfb7b0d424e41151ad29"} Feb 17 09:08:29 crc kubenswrapper[4848]: I0217 09:08:29.615251 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v548h" event={"ID":"3f9aa20a-a818-4f1e-a1e0-345ea27c1832","Type":"ContainerStarted","Data":"9ba54898266cfae6cdab679777686365bed2f7cb06aa1f81e7c73a4faddc57e3"} Feb 17 09:08:30 crc kubenswrapper[4848]: I0217 09:08:30.620877 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:30 crc kubenswrapper[4848]: I0217 09:08:30.629214 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:30 crc kubenswrapper[4848]: I0217 09:08:30.642429 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" podStartSLOduration=20.642409087 podStartE2EDuration="20.642409087s" podCreationTimestamp="2026-02-17 09:08:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:08:30.639667442 +0000 UTC m=+188.182923078" watchObservedRunningTime="2026-02-17 09:08:30.642409087 +0000 UTC m=+188.185664733" Feb 17 09:08:30 crc kubenswrapper[4848]: I0217 09:08:30.663086 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-v548h" podStartSLOduration=4.949743819 podStartE2EDuration="39.663057212s" podCreationTimestamp="2026-02-17 09:07:51 +0000 UTC" firstStartedPulling="2026-02-17 09:07:53.993349993 +0000 UTC m=+151.536605639" lastFinishedPulling="2026-02-17 09:08:28.706663376 +0000 UTC m=+186.249919032" observedRunningTime="2026-02-17 09:08:30.660223684 +0000 UTC m=+188.203479340" watchObservedRunningTime="2026-02-17 09:08:30.663057212 +0000 UTC m=+188.206312858" Feb 17 09:08:30 crc kubenswrapper[4848]: I0217 09:08:30.781270 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-844596c887-vrvtx"] Feb 17 09:08:30 crc kubenswrapper[4848]: I0217 09:08:30.781470 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" podUID="40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" containerName="controller-manager" containerID="cri-o://8dc4bd515b6d87da583dfa1e5d670fbfd76e89678ca795c7db1220c0d49b5e6b" gracePeriod=30 Feb 17 09:08:30 crc kubenswrapper[4848]: I0217 09:08:30.877377 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2"] Feb 17 09:08:31 crc kubenswrapper[4848]: I0217 09:08:31.019708 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xzdww"] Feb 17 09:08:31 crc kubenswrapper[4848]: I0217 09:08:31.485829 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 17 09:08:31 crc kubenswrapper[4848]: I0217 09:08:31.627942 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7pln" event={"ID":"9547ed7d-6e19-4a09-84f1-8afaae314251","Type":"ContainerStarted","Data":"41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd"} Feb 17 
09:08:31 crc kubenswrapper[4848]: I0217 09:08:31.633861 4848 generic.go:334] "Generic (PLEG): container finished" podID="40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" containerID="8dc4bd515b6d87da583dfa1e5d670fbfd76e89678ca795c7db1220c0d49b5e6b" exitCode=0 Feb 17 09:08:31 crc kubenswrapper[4848]: I0217 09:08:31.634378 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" event={"ID":"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2","Type":"ContainerDied","Data":"8dc4bd515b6d87da583dfa1e5d670fbfd76e89678ca795c7db1220c0d49b5e6b"} Feb 17 09:08:31 crc kubenswrapper[4848]: I0217 09:08:31.656083 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h7pln" podStartSLOduration=4.274333275 podStartE2EDuration="39.656068715s" podCreationTimestamp="2026-02-17 09:07:52 +0000 UTC" firstStartedPulling="2026-02-17 09:07:55.076137111 +0000 UTC m=+152.619392747" lastFinishedPulling="2026-02-17 09:08:30.457872541 +0000 UTC m=+188.001128187" observedRunningTime="2026-02-17 09:08:31.654004749 +0000 UTC m=+189.197260395" watchObservedRunningTime="2026-02-17 09:08:31.656068715 +0000 UTC m=+189.199324361" Feb 17 09:08:31 crc kubenswrapper[4848]: I0217 09:08:31.880976 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:08:31 crc kubenswrapper[4848]: I0217 09:08:31.881265 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.077411 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.107478 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k"] Feb 17 09:08:32 crc kubenswrapper[4848]: E0217 09:08:32.107742 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" containerName="controller-manager" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.107774 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" containerName="controller-manager" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.107885 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" containerName="controller-manager" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.108334 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.119697 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k"] Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.164218 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-proxy-ca-bundles\") pod \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.164272 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zgqc\" (UniqueName: \"kubernetes.io/projected/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-kube-api-access-7zgqc\") pod \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.164306 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-config\") pod \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.164350 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-client-ca\") pod \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\" (UID: \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.164380 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-serving-cert\") pod \"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\" (UID: 
\"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2\") " Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.164539 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-proxy-ca-bundles\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.164566 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f59a781-9159-4243-9b47-33388c25af00-serving-cert\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.164585 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-config\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.164633 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-client-ca\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.164663 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsd2d\" (UniqueName: 
\"kubernetes.io/projected/2f59a781-9159-4243-9b47-33388c25af00-kube-api-access-xsd2d\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.165528 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-client-ca" (OuterVolumeSpecName: "client-ca") pod "40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" (UID: "40b8048f-3ea0-48dc-95c8-4a96ffdb14a2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.166105 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-config" (OuterVolumeSpecName: "config") pod "40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" (UID: "40b8048f-3ea0-48dc-95c8-4a96ffdb14a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.167537 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" (UID: "40b8048f-3ea0-48dc-95c8-4a96ffdb14a2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.173925 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-kube-api-access-7zgqc" (OuterVolumeSpecName: "kube-api-access-7zgqc") pod "40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" (UID: "40b8048f-3ea0-48dc-95c8-4a96ffdb14a2"). InnerVolumeSpecName "kube-api-access-7zgqc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.182945 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" (UID: "40b8048f-3ea0-48dc-95c8-4a96ffdb14a2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.265580 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-proxy-ca-bundles\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.265641 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f59a781-9159-4243-9b47-33388c25af00-serving-cert\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.265665 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-config\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.265693 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-client-ca\") pod 
\"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.265719 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsd2d\" (UniqueName: \"kubernetes.io/projected/2f59a781-9159-4243-9b47-33388c25af00-kube-api-access-xsd2d\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.265948 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.265965 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zgqc\" (UniqueName: \"kubernetes.io/projected/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-kube-api-access-7zgqc\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.265980 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.265994 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.266004 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.266995 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-proxy-ca-bundles\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.267646 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-client-ca\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.268047 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-config\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.271556 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f59a781-9159-4243-9b47-33388c25af00-serving-cert\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.276178 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-v548h" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.276240 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-v548h" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 
09:08:32.285794 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsd2d\" (UniqueName: \"kubernetes.io/projected/2f59a781-9159-4243-9b47-33388c25af00-kube-api-access-xsd2d\") pod \"controller-manager-6748b9c5fd-rqx7k\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.444962 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.460389 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-v548h" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.464705 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.495100 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.495161 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.641675 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.645133 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" podUID="71987f3f-8a3b-42b6-a6df-a90f07969caf" containerName="route-controller-manager" containerID="cri-o://46b59167fcae4949e2d5af94c82ed35d73450900a79abfb7b0d424e41151ad29" gracePeriod=30 Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.645189 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-844596c887-vrvtx" event={"ID":"40b8048f-3ea0-48dc-95c8-4a96ffdb14a2","Type":"ContainerDied","Data":"e9d643c099c3964aa5ec80060331250fbc31d578e01336797056596159a5db5b"} Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.645275 4848 scope.go:117] "RemoveContainer" containerID="8dc4bd515b6d87da583dfa1e5d670fbfd76e89678ca795c7db1220c0d49b5e6b" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.672226 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-844596c887-vrvtx"] Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.677630 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-844596c887-vrvtx"] Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.686724 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:08:32 crc kubenswrapper[4848]: I0217 09:08:32.984700 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k"] Feb 17 09:08:32 crc kubenswrapper[4848]: W0217 09:08:32.995217 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f59a781_9159_4243_9b47_33388c25af00.slice/crio-04dfe9f350851317949cb4430e28e473dedb915d710e397b2f47ef19a8937cd1 WatchSource:0}: Error finding container 04dfe9f350851317949cb4430e28e473dedb915d710e397b2f47ef19a8937cd1: Status 404 returned error can't find the container with id 04dfe9f350851317949cb4430e28e473dedb915d710e397b2f47ef19a8937cd1 Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.389424 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40b8048f-3ea0-48dc-95c8-4a96ffdb14a2" path="/var/lib/kubelet/pods/40b8048f-3ea0-48dc-95c8-4a96ffdb14a2/volumes" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.565405 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-h7pln" podUID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerName="registry-server" probeResult="failure" output=< Feb 17 09:08:33 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s Feb 17 09:08:33 crc kubenswrapper[4848]: > Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.651259 4848 generic.go:334] "Generic (PLEG): container finished" podID="71987f3f-8a3b-42b6-a6df-a90f07969caf" containerID="46b59167fcae4949e2d5af94c82ed35d73450900a79abfb7b0d424e41151ad29" exitCode=0 Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.651345 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" event={"ID":"71987f3f-8a3b-42b6-a6df-a90f07969caf","Type":"ContainerDied","Data":"46b59167fcae4949e2d5af94c82ed35d73450900a79abfb7b0d424e41151ad29"} Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.658366 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45twr" 
event={"ID":"b3a74de7-62f1-46c2-b518-6baa8b222b1b","Type":"ContainerStarted","Data":"0dfd6374652a02df264a081a6853759e908e8426ce7b5e274a031b6b96f0c01a"} Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.659922 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" event={"ID":"2f59a781-9159-4243-9b47-33388c25af00","Type":"ContainerStarted","Data":"04dfe9f350851317949cb4430e28e473dedb915d710e397b2f47ef19a8937cd1"} Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.677029 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-45twr" podStartSLOduration=3.890127047 podStartE2EDuration="39.677006055s" podCreationTimestamp="2026-02-17 09:07:54 +0000 UTC" firstStartedPulling="2026-02-17 09:07:56.130923363 +0000 UTC m=+153.674179009" lastFinishedPulling="2026-02-17 09:08:31.917802371 +0000 UTC m=+189.461058017" observedRunningTime="2026-02-17 09:08:33.67644222 +0000 UTC m=+191.219697866" watchObservedRunningTime="2026-02-17 09:08:33.677006055 +0000 UTC m=+191.220261711" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.780393 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.781391 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.783508 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.785531 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.785993 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.790186 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.887515 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hvsj\" (UniqueName: \"kubernetes.io/projected/71987f3f-8a3b-42b6-a6df-a90f07969caf-kube-api-access-9hvsj\") pod \"71987f3f-8a3b-42b6-a6df-a90f07969caf\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.887587 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71987f3f-8a3b-42b6-a6df-a90f07969caf-serving-cert\") pod \"71987f3f-8a3b-42b6-a6df-a90f07969caf\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.887639 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-config\") pod \"71987f3f-8a3b-42b6-a6df-a90f07969caf\" (UID: \"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.887689 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-client-ca\") pod \"71987f3f-8a3b-42b6-a6df-a90f07969caf\" (UID: 
\"71987f3f-8a3b-42b6-a6df-a90f07969caf\") " Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.887874 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"47d23900-ea15-4b43-aeb8-4e83ea35a0bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.887924 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"47d23900-ea15-4b43-aeb8-4e83ea35a0bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.888352 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-client-ca" (OuterVolumeSpecName: "client-ca") pod "71987f3f-8a3b-42b6-a6df-a90f07969caf" (UID: "71987f3f-8a3b-42b6-a6df-a90f07969caf"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.888486 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-config" (OuterVolumeSpecName: "config") pod "71987f3f-8a3b-42b6-a6df-a90f07969caf" (UID: "71987f3f-8a3b-42b6-a6df-a90f07969caf"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.892581 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71987f3f-8a3b-42b6-a6df-a90f07969caf-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "71987f3f-8a3b-42b6-a6df-a90f07969caf" (UID: "71987f3f-8a3b-42b6-a6df-a90f07969caf"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.892975 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71987f3f-8a3b-42b6-a6df-a90f07969caf-kube-api-access-9hvsj" (OuterVolumeSpecName: "kube-api-access-9hvsj") pod "71987f3f-8a3b-42b6-a6df-a90f07969caf" (UID: "71987f3f-8a3b-42b6-a6df-a90f07969caf"). InnerVolumeSpecName "kube-api-access-9hvsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.988703 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"47d23900-ea15-4b43-aeb8-4e83ea35a0bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.988833 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"47d23900-ea15-4b43-aeb8-4e83ea35a0bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.988896 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hvsj\" (UniqueName: \"kubernetes.io/projected/71987f3f-8a3b-42b6-a6df-a90f07969caf-kube-api-access-9hvsj\") on node \"crc\" 
DevicePath \"\"" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.988913 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71987f3f-8a3b-42b6-a6df-a90f07969caf-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.988925 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.988936 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71987f3f-8a3b-42b6-a6df-a90f07969caf-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:33 crc kubenswrapper[4848]: I0217 09:08:33.988962 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"47d23900-ea15-4b43-aeb8-4e83ea35a0bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.004992 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"47d23900-ea15-4b43-aeb8-4e83ea35a0bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.061311 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.061374 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.103475 4848 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.115837 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.550222 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-7vmx8" Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.665810 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" event={"ID":"71987f3f-8a3b-42b6-a6df-a90f07969caf","Type":"ContainerDied","Data":"30de5b1dd6d80def24a2556dfcd31f987b683e6a1343078da661c937295398a1"} Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.665854 4848 scope.go:117] "RemoveContainer" containerID="46b59167fcae4949e2d5af94c82ed35d73450900a79abfb7b0d424e41151ad29" Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.666049 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2" Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.705489 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2"] Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.708952 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6867996bd5-5d4d2"] Feb 17 09:08:34 crc kubenswrapper[4848]: I0217 09:08:34.720059 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.060058 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.060125 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.070320 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-768677949c-26mgx"] Feb 17 09:08:35 crc kubenswrapper[4848]: E0217 09:08:35.070575 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71987f3f-8a3b-42b6-a6df-a90f07969caf" containerName="route-controller-manager" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.070613 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="71987f3f-8a3b-42b6-a6df-a90f07969caf" containerName="route-controller-manager" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.070744 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="71987f3f-8a3b-42b6-a6df-a90f07969caf" containerName="route-controller-manager" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.071154 4848 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.075438 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.077283 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.077417 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.077520 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.078016 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.081335 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.085677 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-768677949c-26mgx"] Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.208369 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a27259-110e-41d2-b8a8-319c784975a6-serving-cert\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.208414 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-config\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.208458 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-client-ca\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.208491 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hjrs\" (UniqueName: \"kubernetes.io/projected/f1a27259-110e-41d2-b8a8-319c784975a6-kube-api-access-5hjrs\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.310115 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-client-ca\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.310180 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hjrs\" (UniqueName: \"kubernetes.io/projected/f1a27259-110e-41d2-b8a8-319c784975a6-kube-api-access-5hjrs\") 
pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.310218 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a27259-110e-41d2-b8a8-319c784975a6-serving-cert\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.310236 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-config\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.311650 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-config\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.312138 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-client-ca\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.318453 4848 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a27259-110e-41d2-b8a8-319c784975a6-serving-cert\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.327852 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hjrs\" (UniqueName: \"kubernetes.io/projected/f1a27259-110e-41d2-b8a8-319c784975a6-kube-api-access-5hjrs\") pod \"route-controller-manager-768677949c-26mgx\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.396417 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.405669 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71987f3f-8a3b-42b6-a6df-a90f07969caf" path="/var/lib/kubelet/pods/71987f3f-8a3b-42b6-a6df-a90f07969caf/volumes" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.510481 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.712001 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"47d23900-ea15-4b43-aeb8-4e83ea35a0bc","Type":"ContainerStarted","Data":"2a2137a3bbaa1a200b75592852b5f1adcff6821e63a9bff3455eee35c9bf2845"} Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.742245 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wt92" 
event={"ID":"09ac4713-0dcb-4908-8063-b6e029c132d7","Type":"ContainerStarted","Data":"30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe"} Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.798853 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" event={"ID":"2f59a781-9159-4243-9b47-33388c25af00","Type":"ContainerStarted","Data":"b3cde2f68e9be0728b046cab7233889221a976c46ec9a15a2516d9310e9373c9"} Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.800012 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.828293 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4wt92" podStartSLOduration=3.484847652 podStartE2EDuration="44.82827846s" podCreationTimestamp="2026-02-17 09:07:51 +0000 UTC" firstStartedPulling="2026-02-17 09:07:53.920576623 +0000 UTC m=+151.463832269" lastFinishedPulling="2026-02-17 09:08:35.264007431 +0000 UTC m=+192.807263077" observedRunningTime="2026-02-17 09:08:35.766257094 +0000 UTC m=+193.309512740" watchObservedRunningTime="2026-02-17 09:08:35.82827846 +0000 UTC m=+193.371534096" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.837341 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.848119 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdbtg" event={"ID":"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a","Type":"ContainerStarted","Data":"d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6"} Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.859278 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" podStartSLOduration=5.859260737 podStartE2EDuration="5.859260737s" podCreationTimestamp="2026-02-17 09:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:08:35.829090912 +0000 UTC m=+193.372346548" watchObservedRunningTime="2026-02-17 09:08:35.859260737 +0000 UTC m=+193.402516383" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.889845 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bdbtg" podStartSLOduration=2.7843218629999997 podStartE2EDuration="40.889830653s" podCreationTimestamp="2026-02-17 09:07:55 +0000 UTC" firstStartedPulling="2026-02-17 09:07:57.14893897 +0000 UTC m=+154.692194616" lastFinishedPulling="2026-02-17 09:08:35.25444776 +0000 UTC m=+192.797703406" observedRunningTime="2026-02-17 09:08:35.889190095 +0000 UTC m=+193.432445741" watchObservedRunningTime="2026-02-17 09:08:35.889830653 +0000 UTC m=+193.433086299" Feb 17 09:08:35 crc kubenswrapper[4848]: I0217 09:08:35.991102 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-768677949c-26mgx"] Feb 17 09:08:36 crc kubenswrapper[4848]: W0217 09:08:36.001292 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1a27259_110e_41d2_b8a8_319c784975a6.slice/crio-0ca5b00e8cbd049b0d705b4c10d5a8ea7fae9bd930cb19f96a9734174ea1e0c6 WatchSource:0}: Error finding container 0ca5b00e8cbd049b0d705b4c10d5a8ea7fae9bd930cb19f96a9734174ea1e0c6: Status 404 returned error can't find the container with id 0ca5b00e8cbd049b0d705b4c10d5a8ea7fae9bd930cb19f96a9734174ea1e0c6 Feb 17 09:08:36 crc kubenswrapper[4848]: I0217 09:08:36.130446 4848 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-45twr" podUID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerName="registry-server" probeResult="failure" output=< Feb 17 09:08:36 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s Feb 17 09:08:36 crc kubenswrapper[4848]: > Feb 17 09:08:36 crc kubenswrapper[4848]: I0217 09:08:36.856242 4848 generic.go:334] "Generic (PLEG): container finished" podID="47d23900-ea15-4b43-aeb8-4e83ea35a0bc" containerID="700447d0b9477eafb3580976634d0d6042ca2eda4798096fe13464f48087b12c" exitCode=0 Feb 17 09:08:36 crc kubenswrapper[4848]: I0217 09:08:36.856354 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"47d23900-ea15-4b43-aeb8-4e83ea35a0bc","Type":"ContainerDied","Data":"700447d0b9477eafb3580976634d0d6042ca2eda4798096fe13464f48087b12c"} Feb 17 09:08:36 crc kubenswrapper[4848]: I0217 09:08:36.864891 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" event={"ID":"f1a27259-110e-41d2-b8a8-319c784975a6","Type":"ContainerStarted","Data":"ffcd95b1692dccfa2cd5f659df40c53037420326bc87933a1a25b673a0a91db6"} Feb 17 09:08:36 crc kubenswrapper[4848]: I0217 09:08:36.864949 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" event={"ID":"f1a27259-110e-41d2-b8a8-319c784975a6","Type":"ContainerStarted","Data":"0ca5b00e8cbd049b0d705b4c10d5a8ea7fae9bd930cb19f96a9734174ea1e0c6"} Feb 17 09:08:36 crc kubenswrapper[4848]: I0217 09:08:36.923904 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" podStartSLOduration=6.923877728 podStartE2EDuration="6.923877728s" podCreationTimestamp="2026-02-17 09:08:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:08:36.917107173 +0000 UTC m=+194.460362819" watchObservedRunningTime="2026-02-17 09:08:36.923877728 +0000 UTC m=+194.467133374" Feb 17 09:08:37 crc kubenswrapper[4848]: I0217 09:08:37.880548 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" containerID="5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75" exitCode=0 Feb 17 09:08:37 crc kubenswrapper[4848]: I0217 09:08:37.881826 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8fsn" event={"ID":"7c90e73c-c31d-4c69-a555-f191b15f8cb7","Type":"ContainerDied","Data":"5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75"} Feb 17 09:08:37 crc kubenswrapper[4848]: I0217 09:08:37.881868 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:37 crc kubenswrapper[4848]: I0217 09:08:37.887967 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.186749 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.268201 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kube-api-access\") pod \"47d23900-ea15-4b43-aeb8-4e83ea35a0bc\" (UID: \"47d23900-ea15-4b43-aeb8-4e83ea35a0bc\") " Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.268276 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kubelet-dir\") pod \"47d23900-ea15-4b43-aeb8-4e83ea35a0bc\" (UID: \"47d23900-ea15-4b43-aeb8-4e83ea35a0bc\") " Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.268528 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "47d23900-ea15-4b43-aeb8-4e83ea35a0bc" (UID: "47d23900-ea15-4b43-aeb8-4e83ea35a0bc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.276021 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "47d23900-ea15-4b43-aeb8-4e83ea35a0bc" (UID: "47d23900-ea15-4b43-aeb8-4e83ea35a0bc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.369915 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.370204 4848 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/47d23900-ea15-4b43-aeb8-4e83ea35a0bc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.573260 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 09:08:38 crc kubenswrapper[4848]: E0217 09:08:38.573837 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47d23900-ea15-4b43-aeb8-4e83ea35a0bc" containerName="pruner" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.573947 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="47d23900-ea15-4b43-aeb8-4e83ea35a0bc" containerName="pruner" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.574125 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="47d23900-ea15-4b43-aeb8-4e83ea35a0bc" containerName="pruner" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.574562 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.589698 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.674048 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-var-lock\") pod \"installer-9-crc\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.674136 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5822205e-65ea-4135-816e-f7ddedd77d77-kube-api-access\") pod \"installer-9-crc\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.674182 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.775853 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-var-lock\") pod \"installer-9-crc\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.775903 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5822205e-65ea-4135-816e-f7ddedd77d77-kube-api-access\") pod \"installer-9-crc\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.775932 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.776053 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.776112 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-var-lock\") pod \"installer-9-crc\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.796337 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5822205e-65ea-4135-816e-f7ddedd77d77-kube-api-access\") pod \"installer-9-crc\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.888254 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.888257 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"47d23900-ea15-4b43-aeb8-4e83ea35a0bc","Type":"ContainerDied","Data":"2a2137a3bbaa1a200b75592852b5f1adcff6821e63a9bff3455eee35c9bf2845"} Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.888311 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a2137a3bbaa1a200b75592852b5f1adcff6821e63a9bff3455eee35c9bf2845" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.888914 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.892056 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8fsn" event={"ID":"7c90e73c-c31d-4c69-a555-f191b15f8cb7","Type":"ContainerStarted","Data":"af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c"} Feb 17 09:08:38 crc kubenswrapper[4848]: I0217 09:08:38.911134 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-q8fsn" podStartSLOduration=2.727970789 podStartE2EDuration="44.911115879s" podCreationTimestamp="2026-02-17 09:07:54 +0000 UTC" firstStartedPulling="2026-02-17 09:07:56.112386196 +0000 UTC m=+153.655641842" lastFinishedPulling="2026-02-17 09:08:38.295531286 +0000 UTC m=+195.838786932" observedRunningTime="2026-02-17 09:08:38.908941896 +0000 UTC m=+196.452197532" watchObservedRunningTime="2026-02-17 09:08:38.911115879 +0000 UTC m=+196.454371525" Feb 17 09:08:39 crc kubenswrapper[4848]: I0217 09:08:39.279504 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 17 09:08:39 crc kubenswrapper[4848]: I0217 09:08:39.912133 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5822205e-65ea-4135-816e-f7ddedd77d77","Type":"ContainerStarted","Data":"02467ea22d8465dd5a45814c0f3f67241fdbbeee2e8ccc3411cbb1793392c28f"} Feb 17 09:08:39 crc kubenswrapper[4848]: I0217 09:08:39.912473 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5822205e-65ea-4135-816e-f7ddedd77d77","Type":"ContainerStarted","Data":"2cc043e6863cec491cd69ae0cae6289cc92d7ae24242e4995bf025684d8f427f"} Feb 17 09:08:39 crc kubenswrapper[4848]: I0217 09:08:39.929307 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=1.929284137 podStartE2EDuration="1.929284137s" podCreationTimestamp="2026-02-17 09:08:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:08:39.928750062 +0000 UTC m=+197.472005708" watchObservedRunningTime="2026-02-17 09:08:39.929284137 +0000 UTC m=+197.472539783" Feb 17 09:08:42 crc kubenswrapper[4848]: I0217 09:08:42.070176 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:08:42 crc kubenswrapper[4848]: I0217 09:08:42.070774 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:08:42 crc kubenswrapper[4848]: I0217 09:08:42.130524 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:08:42 crc kubenswrapper[4848]: I0217 09:08:42.328239 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-v548h" Feb 17 09:08:42 crc kubenswrapper[4848]: I0217 09:08:42.534627 4848 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:08:42 crc kubenswrapper[4848]: I0217 09:08:42.582183 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:08:42 crc kubenswrapper[4848]: I0217 09:08:42.977807 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:08:44 crc kubenswrapper[4848]: I0217 09:08:44.474218 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:08:44 crc kubenswrapper[4848]: I0217 09:08:44.475174 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:08:44 crc kubenswrapper[4848]: I0217 09:08:44.536657 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:08:44 crc kubenswrapper[4848]: I0217 09:08:44.536833 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v548h"] Feb 17 09:08:44 crc kubenswrapper[4848]: I0217 09:08:44.538292 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-v548h" podUID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" containerName="registry-server" containerID="cri-o://9ba54898266cfae6cdab679777686365bed2f7cb06aa1f81e7c73a4faddc57e3" gracePeriod=2 Feb 17 09:08:44 crc kubenswrapper[4848]: I0217 09:08:44.987937 4848 generic.go:334] "Generic (PLEG): container finished" podID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" containerID="9ba54898266cfae6cdab679777686365bed2f7cb06aa1f81e7c73a4faddc57e3" exitCode=0 Feb 17 09:08:44 crc kubenswrapper[4848]: I0217 09:08:44.989007 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-v548h" event={"ID":"3f9aa20a-a818-4f1e-a1e0-345ea27c1832","Type":"ContainerDied","Data":"9ba54898266cfae6cdab679777686365bed2f7cb06aa1f81e7c73a4faddc57e3"} Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.081615 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.102631 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.103199 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v548h" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.139644 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.169997 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kn6pg\" (UniqueName: \"kubernetes.io/projected/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-kube-api-access-kn6pg\") pod \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.170100 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-catalog-content\") pod \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\" (UID: \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.170153 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-utilities\") pod \"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\" (UID: 
\"3f9aa20a-a818-4f1e-a1e0-345ea27c1832\") " Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.171049 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-utilities" (OuterVolumeSpecName: "utilities") pod "3f9aa20a-a818-4f1e-a1e0-345ea27c1832" (UID: "3f9aa20a-a818-4f1e-a1e0-345ea27c1832"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.200907 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-kube-api-access-kn6pg" (OuterVolumeSpecName: "kube-api-access-kn6pg") pod "3f9aa20a-a818-4f1e-a1e0-345ea27c1832" (UID: "3f9aa20a-a818-4f1e-a1e0-345ea27c1832"). InnerVolumeSpecName "kube-api-access-kn6pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.228800 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f9aa20a-a818-4f1e-a1e0-345ea27c1832" (UID: "3f9aa20a-a818-4f1e-a1e0-345ea27c1832"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.271198 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kn6pg\" (UniqueName: \"kubernetes.io/projected/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-kube-api-access-kn6pg\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.271227 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.271237 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f9aa20a-a818-4f1e-a1e0-345ea27c1832-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.464593 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.464825 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.514871 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.923836 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8fsn"] Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.997258 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-v548h" event={"ID":"3f9aa20a-a818-4f1e-a1e0-345ea27c1832","Type":"ContainerDied","Data":"2a998c8b5d6418b84de119da9688188c68b2612b11c783469aee37ac845aaec6"} Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.997454 4848 
scope.go:117] "RemoveContainer" containerID="9ba54898266cfae6cdab679777686365bed2f7cb06aa1f81e7c73a4faddc57e3" Feb 17 09:08:45 crc kubenswrapper[4848]: I0217 09:08:45.997732 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-v548h" Feb 17 09:08:46 crc kubenswrapper[4848]: I0217 09:08:46.018262 4848 scope.go:117] "RemoveContainer" containerID="fd6ea090bd5c207540d8068a4d1141c9e5bedc19c6432f19fa315d9f7a59483e" Feb 17 09:08:46 crc kubenswrapper[4848]: I0217 09:08:46.021624 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-v548h"] Feb 17 09:08:46 crc kubenswrapper[4848]: I0217 09:08:46.035886 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-v548h"] Feb 17 09:08:46 crc kubenswrapper[4848]: I0217 09:08:46.036873 4848 scope.go:117] "RemoveContainer" containerID="01ac18f970beedb1abd2bdaa5e33862ad28000bfd893a0f871c460a5e3251117" Feb 17 09:08:46 crc kubenswrapper[4848]: I0217 09:08:46.051324 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:08:46 crc kubenswrapper[4848]: I0217 09:08:46.925776 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h7pln"] Feb 17 09:08:46 crc kubenswrapper[4848]: I0217 09:08:46.926000 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h7pln" podUID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerName="registry-server" containerID="cri-o://41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd" gracePeriod=2 Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.005348 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-q8fsn" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" 
containerName="registry-server" containerID="cri-o://af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c" gracePeriod=2 Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.397577 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" path="/var/lib/kubelet/pods/3f9aa20a-a818-4f1e-a1e0-345ea27c1832/volumes" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.414496 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.501628 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk6d8\" (UniqueName: \"kubernetes.io/projected/9547ed7d-6e19-4a09-84f1-8afaae314251-kube-api-access-rk6d8\") pod \"9547ed7d-6e19-4a09-84f1-8afaae314251\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.501743 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-utilities\") pod \"9547ed7d-6e19-4a09-84f1-8afaae314251\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.501815 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-catalog-content\") pod \"9547ed7d-6e19-4a09-84f1-8afaae314251\" (UID: \"9547ed7d-6e19-4a09-84f1-8afaae314251\") " Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.503209 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-utilities" (OuterVolumeSpecName: "utilities") pod "9547ed7d-6e19-4a09-84f1-8afaae314251" (UID: "9547ed7d-6e19-4a09-84f1-8afaae314251"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.506608 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.506831 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9547ed7d-6e19-4a09-84f1-8afaae314251-kube-api-access-rk6d8" (OuterVolumeSpecName: "kube-api-access-rk6d8") pod "9547ed7d-6e19-4a09-84f1-8afaae314251" (UID: "9547ed7d-6e19-4a09-84f1-8afaae314251"). InnerVolumeSpecName "kube-api-access-rk6d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.558736 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9547ed7d-6e19-4a09-84f1-8afaae314251" (UID: "9547ed7d-6e19-4a09-84f1-8afaae314251"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.602891 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rgrc9\" (UniqueName: \"kubernetes.io/projected/7c90e73c-c31d-4c69-a555-f191b15f8cb7-kube-api-access-rgrc9\") pod \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.602946 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-catalog-content\") pod \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.603048 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-utilities\") pod \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\" (UID: \"7c90e73c-c31d-4c69-a555-f191b15f8cb7\") " Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.603804 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-utilities" (OuterVolumeSpecName: "utilities") pod "7c90e73c-c31d-4c69-a555-f191b15f8cb7" (UID: "7c90e73c-c31d-4c69-a555-f191b15f8cb7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.604174 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.604209 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk6d8\" (UniqueName: \"kubernetes.io/projected/9547ed7d-6e19-4a09-84f1-8afaae314251-kube-api-access-rk6d8\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.604224 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.604236 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9547ed7d-6e19-4a09-84f1-8afaae314251-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.606059 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c90e73c-c31d-4c69-a555-f191b15f8cb7-kube-api-access-rgrc9" (OuterVolumeSpecName: "kube-api-access-rgrc9") pod "7c90e73c-c31d-4c69-a555-f191b15f8cb7" (UID: "7c90e73c-c31d-4c69-a555-f191b15f8cb7"). InnerVolumeSpecName "kube-api-access-rgrc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.625252 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c90e73c-c31d-4c69-a555-f191b15f8cb7" (UID: "7c90e73c-c31d-4c69-a555-f191b15f8cb7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.706932 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rgrc9\" (UniqueName: \"kubernetes.io/projected/7c90e73c-c31d-4c69-a555-f191b15f8cb7-kube-api-access-rgrc9\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:47 crc kubenswrapper[4848]: I0217 09:08:47.706978 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c90e73c-c31d-4c69-a555-f191b15f8cb7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.011825 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" containerID="af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c" exitCode=0 Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.011904 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-q8fsn" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.011911 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8fsn" event={"ID":"7c90e73c-c31d-4c69-a555-f191b15f8cb7","Type":"ContainerDied","Data":"af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c"} Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.011962 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-q8fsn" event={"ID":"7c90e73c-c31d-4c69-a555-f191b15f8cb7","Type":"ContainerDied","Data":"efe2a0e95c15dd0a3893079b920f9fe31f556b170abc521983c52d1526af36d3"} Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.011990 4848 scope.go:117] "RemoveContainer" containerID="af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.016141 4848 generic.go:334] "Generic (PLEG): container 
finished" podID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerID="41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd" exitCode=0 Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.016884 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h7pln" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.023484 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7pln" event={"ID":"9547ed7d-6e19-4a09-84f1-8afaae314251","Type":"ContainerDied","Data":"41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd"} Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.023597 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h7pln" event={"ID":"9547ed7d-6e19-4a09-84f1-8afaae314251","Type":"ContainerDied","Data":"3e951031add0e1553c2ad5ccdf955c33cb529793e8e61edfbf8808e4cf450ed5"} Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.025022 4848 scope.go:117] "RemoveContainer" containerID="5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.040150 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8fsn"] Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.042481 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-q8fsn"] Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.043612 4848 scope.go:117] "RemoveContainer" containerID="090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.067942 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h7pln"] Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.070608 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-h7pln"] Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.072540 4848 scope.go:117] "RemoveContainer" containerID="af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c" Feb 17 09:08:48 crc kubenswrapper[4848]: E0217 09:08:48.073259 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c\": container with ID starting with af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c not found: ID does not exist" containerID="af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.073285 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c"} err="failed to get container status \"af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c\": rpc error: code = NotFound desc = could not find container \"af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c\": container with ID starting with af3b80bd6448a0124c3a76f8850f11374b4031c94bf17d42f38b9f0c90fee20c not found: ID does not exist" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.073305 4848 scope.go:117] "RemoveContainer" containerID="5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75" Feb 17 09:08:48 crc kubenswrapper[4848]: E0217 09:08:48.073638 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75\": container with ID starting with 5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75 not found: ID does not exist" containerID="5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 
09:08:48.073659 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75"} err="failed to get container status \"5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75\": rpc error: code = NotFound desc = could not find container \"5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75\": container with ID starting with 5c8dd9ca654bef0f6a2b79a538f543dd70c6dba102cf53d7c16eaad0d93a6f75 not found: ID does not exist" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.073671 4848 scope.go:117] "RemoveContainer" containerID="090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087" Feb 17 09:08:48 crc kubenswrapper[4848]: E0217 09:08:48.074002 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087\": container with ID starting with 090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087 not found: ID does not exist" containerID="090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.074023 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087"} err="failed to get container status \"090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087\": rpc error: code = NotFound desc = could not find container \"090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087\": container with ID starting with 090c05f3bfc42e71b6e900e50217ebcfd455c7d30bcda63c7efda213082da087 not found: ID does not exist" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.074037 4848 scope.go:117] "RemoveContainer" containerID="41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd" Feb 17 09:08:48 crc 
kubenswrapper[4848]: I0217 09:08:48.091552 4848 scope.go:117] "RemoveContainer" containerID="f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.105173 4848 scope.go:117] "RemoveContainer" containerID="b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.123261 4848 scope.go:117] "RemoveContainer" containerID="41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd" Feb 17 09:08:48 crc kubenswrapper[4848]: E0217 09:08:48.125063 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd\": container with ID starting with 41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd not found: ID does not exist" containerID="41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.125095 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd"} err="failed to get container status \"41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd\": rpc error: code = NotFound desc = could not find container \"41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd\": container with ID starting with 41a16cb93ae7b9d7e166b6067856ae2fd15d79feac67dc152ad44c6c853c10dd not found: ID does not exist" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.125122 4848 scope.go:117] "RemoveContainer" containerID="f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15" Feb 17 09:08:48 crc kubenswrapper[4848]: E0217 09:08:48.125386 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15\": container with ID starting with f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15 not found: ID does not exist" containerID="f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.125411 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15"} err="failed to get container status \"f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15\": rpc error: code = NotFound desc = could not find container \"f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15\": container with ID starting with f5335f747c639d60688c4ca2c39ca23ad7bb34840672fb5eaf8bac4f53190c15 not found: ID does not exist" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.125433 4848 scope.go:117] "RemoveContainer" containerID="b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa" Feb 17 09:08:48 crc kubenswrapper[4848]: E0217 09:08:48.125813 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa\": container with ID starting with b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa not found: ID does not exist" containerID="b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.125838 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa"} err="failed to get container status \"b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa\": rpc error: code = NotFound desc = could not find container \"b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa\": container with ID 
starting with b530442e931e6fecdd5aec7985c770e288acec5fc324fc51609e0ea2390c9faa not found: ID does not exist" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.329897 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bdbtg"] Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.773466 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.773566 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.773650 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.774609 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:08:48 crc kubenswrapper[4848]: I0217 09:08:48.774792 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" 
containerID="cri-o://d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8" gracePeriod=600 Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.036786 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8" exitCode=0 Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.037135 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8"} Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.043513 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bdbtg" podUID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerName="registry-server" containerID="cri-o://d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6" gracePeriod=2 Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.390424 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" path="/var/lib/kubelet/pods/7c90e73c-c31d-4c69-a555-f191b15f8cb7/volumes" Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.391562 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9547ed7d-6e19-4a09-84f1-8afaae314251" path="/var/lib/kubelet/pods/9547ed7d-6e19-4a09-84f1-8afaae314251/volumes" Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.539549 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.632367 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-catalog-content\") pod \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.632437 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wnkj\" (UniqueName: \"kubernetes.io/projected/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-kube-api-access-7wnkj\") pod \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.632481 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-utilities\") pod \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\" (UID: \"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a\") " Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.633479 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-utilities" (OuterVolumeSpecName: "utilities") pod "a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" (UID: "a3c5c067-a917-4f03-9cd1-6c5b3c1d768a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.638915 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-kube-api-access-7wnkj" (OuterVolumeSpecName: "kube-api-access-7wnkj") pod "a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" (UID: "a3c5c067-a917-4f03-9cd1-6c5b3c1d768a"). InnerVolumeSpecName "kube-api-access-7wnkj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.733950 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wnkj\" (UniqueName: \"kubernetes.io/projected/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-kube-api-access-7wnkj\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.733983 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.773660 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" (UID: "a3c5c067-a917-4f03-9cd1-6c5b3c1d768a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:08:49 crc kubenswrapper[4848]: I0217 09:08:49.834786 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.054796 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"c10c26abd2aa6ffa5969a35af04b1b8e427c06a994c5d6c12ac096d788cf26a4"} Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.058051 4848 generic.go:334] "Generic (PLEG): container finished" podID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerID="d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6" exitCode=0 Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.058150 4848 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bdbtg" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.060189 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdbtg" event={"ID":"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a","Type":"ContainerDied","Data":"d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6"} Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.060298 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bdbtg" event={"ID":"a3c5c067-a917-4f03-9cd1-6c5b3c1d768a","Type":"ContainerDied","Data":"d7f37cf00f13bfa71b6a6fb43325db25bc30ae886626a42aaa08da81ff192cb0"} Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.060383 4848 scope.go:117] "RemoveContainer" containerID="d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.088910 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bdbtg"] Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.089924 4848 scope.go:117] "RemoveContainer" containerID="8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.097431 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bdbtg"] Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.110078 4848 scope.go:117] "RemoveContainer" containerID="c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.142423 4848 scope.go:117] "RemoveContainer" containerID="d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6" Feb 17 09:08:50 crc kubenswrapper[4848]: E0217 09:08:50.142987 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6\": container with ID starting with d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6 not found: ID does not exist" containerID="d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.143024 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6"} err="failed to get container status \"d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6\": rpc error: code = NotFound desc = could not find container \"d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6\": container with ID starting with d547028872a4f5cc3a84b6d04b0fe0e1d2a9ba6cbad9a437464a77668c7920c6 not found: ID does not exist" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.143049 4848 scope.go:117] "RemoveContainer" containerID="8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577" Feb 17 09:08:50 crc kubenswrapper[4848]: E0217 09:08:50.143477 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577\": container with ID starting with 8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577 not found: ID does not exist" containerID="8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.143513 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577"} err="failed to get container status \"8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577\": rpc error: code = NotFound desc = could not find container \"8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577\": container with ID 
starting with 8f09ef8a3481ce7b999cd972f2f5cdc10575b2629ba5c27f43a4e1d5491a1577 not found: ID does not exist" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.143536 4848 scope.go:117] "RemoveContainer" containerID="c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693" Feb 17 09:08:50 crc kubenswrapper[4848]: E0217 09:08:50.143912 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693\": container with ID starting with c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693 not found: ID does not exist" containerID="c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.143937 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693"} err="failed to get container status \"c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693\": rpc error: code = NotFound desc = could not find container \"c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693\": container with ID starting with c761fbb1364d4a774383d65d14baf8c9753f68372cffb85d3c54a5e475d85693 not found: ID does not exist" Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.736956 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k"] Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.737265 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" podUID="2f59a781-9159-4243-9b47-33388c25af00" containerName="controller-manager" containerID="cri-o://b3cde2f68e9be0728b046cab7233889221a976c46ec9a15a2516d9310e9373c9" gracePeriod=30 Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.753217 4848 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-768677949c-26mgx"] Feb 17 09:08:50 crc kubenswrapper[4848]: I0217 09:08:50.753494 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" podUID="f1a27259-110e-41d2-b8a8-319c784975a6" containerName="route-controller-manager" containerID="cri-o://ffcd95b1692dccfa2cd5f659df40c53037420326bc87933a1a25b673a0a91db6" gracePeriod=30 Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.065674 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f59a781-9159-4243-9b47-33388c25af00" containerID="b3cde2f68e9be0728b046cab7233889221a976c46ec9a15a2516d9310e9373c9" exitCode=0 Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.065778 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" event={"ID":"2f59a781-9159-4243-9b47-33388c25af00","Type":"ContainerDied","Data":"b3cde2f68e9be0728b046cab7233889221a976c46ec9a15a2516d9310e9373c9"} Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.068614 4848 generic.go:334] "Generic (PLEG): container finished" podID="f1a27259-110e-41d2-b8a8-319c784975a6" containerID="ffcd95b1692dccfa2cd5f659df40c53037420326bc87933a1a25b673a0a91db6" exitCode=0 Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.068694 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" event={"ID":"f1a27259-110e-41d2-b8a8-319c784975a6","Type":"ContainerDied","Data":"ffcd95b1692dccfa2cd5f659df40c53037420326bc87933a1a25b673a0a91db6"} Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.334559 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.391270 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" path="/var/lib/kubelet/pods/a3c5c067-a917-4f03-9cd1-6c5b3c1d768a/volumes" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.415810 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.456815 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-config\") pod \"2f59a781-9159-4243-9b47-33388c25af00\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.456915 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f59a781-9159-4243-9b47-33388c25af00-serving-cert\") pod \"2f59a781-9159-4243-9b47-33388c25af00\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.456948 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-proxy-ca-bundles\") pod \"2f59a781-9159-4243-9b47-33388c25af00\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.456969 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a27259-110e-41d2-b8a8-319c784975a6-serving-cert\") pod \"f1a27259-110e-41d2-b8a8-319c784975a6\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " Feb 17 09:08:51 crc 
kubenswrapper[4848]: I0217 09:08:51.457039 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-client-ca\") pod \"f1a27259-110e-41d2-b8a8-319c784975a6\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.457092 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hjrs\" (UniqueName: \"kubernetes.io/projected/f1a27259-110e-41d2-b8a8-319c784975a6-kube-api-access-5hjrs\") pod \"f1a27259-110e-41d2-b8a8-319c784975a6\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.457127 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsd2d\" (UniqueName: \"kubernetes.io/projected/2f59a781-9159-4243-9b47-33388c25af00-kube-api-access-xsd2d\") pod \"2f59a781-9159-4243-9b47-33388c25af00\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.457176 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-client-ca\") pod \"2f59a781-9159-4243-9b47-33388c25af00\" (UID: \"2f59a781-9159-4243-9b47-33388c25af00\") " Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.457195 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-config\") pod \"f1a27259-110e-41d2-b8a8-319c784975a6\" (UID: \"f1a27259-110e-41d2-b8a8-319c784975a6\") " Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.458301 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-proxy-ca-bundles" (OuterVolumeSpecName: 
"proxy-ca-bundles") pod "2f59a781-9159-4243-9b47-33388c25af00" (UID: "2f59a781-9159-4243-9b47-33388c25af00"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.458444 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-config" (OuterVolumeSpecName: "config") pod "2f59a781-9159-4243-9b47-33388c25af00" (UID: "2f59a781-9159-4243-9b47-33388c25af00"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.458890 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-client-ca" (OuterVolumeSpecName: "client-ca") pod "f1a27259-110e-41d2-b8a8-319c784975a6" (UID: "f1a27259-110e-41d2-b8a8-319c784975a6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.458971 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-config" (OuterVolumeSpecName: "config") pod "f1a27259-110e-41d2-b8a8-319c784975a6" (UID: "f1a27259-110e-41d2-b8a8-319c784975a6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.458969 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-client-ca" (OuterVolumeSpecName: "client-ca") pod "2f59a781-9159-4243-9b47-33388c25af00" (UID: "2f59a781-9159-4243-9b47-33388c25af00"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.466024 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f59a781-9159-4243-9b47-33388c25af00-kube-api-access-xsd2d" (OuterVolumeSpecName: "kube-api-access-xsd2d") pod "2f59a781-9159-4243-9b47-33388c25af00" (UID: "2f59a781-9159-4243-9b47-33388c25af00"). InnerVolumeSpecName "kube-api-access-xsd2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.469519 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f59a781-9159-4243-9b47-33388c25af00-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2f59a781-9159-4243-9b47-33388c25af00" (UID: "2f59a781-9159-4243-9b47-33388c25af00"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.469538 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f1a27259-110e-41d2-b8a8-319c784975a6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f1a27259-110e-41d2-b8a8-319c784975a6" (UID: "f1a27259-110e-41d2-b8a8-319c784975a6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.469665 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1a27259-110e-41d2-b8a8-319c784975a6-kube-api-access-5hjrs" (OuterVolumeSpecName: "kube-api-access-5hjrs") pod "f1a27259-110e-41d2-b8a8-319c784975a6" (UID: "f1a27259-110e-41d2-b8a8-319c784975a6"). InnerVolumeSpecName "kube-api-access-5hjrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.559372 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f59a781-9159-4243-9b47-33388c25af00-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.559615 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f1a27259-110e-41d2-b8a8-319c784975a6-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.559633 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.559649 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.559660 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hjrs\" (UniqueName: \"kubernetes.io/projected/f1a27259-110e-41d2-b8a8-319c784975a6-kube-api-access-5hjrs\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.559674 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsd2d\" (UniqueName: \"kubernetes.io/projected/2f59a781-9159-4243-9b47-33388c25af00-kube-api-access-xsd2d\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.559685 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.559696 4848 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1a27259-110e-41d2-b8a8-319c784975a6-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:51 crc kubenswrapper[4848]: I0217 09:08:51.559707 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f59a781-9159-4243-9b47-33388c25af00-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.077682 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" event={"ID":"f1a27259-110e-41d2-b8a8-319c784975a6","Type":"ContainerDied","Data":"0ca5b00e8cbd049b0d705b4c10d5a8ea7fae9bd930cb19f96a9734174ea1e0c6"} Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.077755 4848 scope.go:117] "RemoveContainer" containerID="ffcd95b1692dccfa2cd5f659df40c53037420326bc87933a1a25b673a0a91db6" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.079433 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-768677949c-26mgx" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.088526 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" event={"ID":"2f59a781-9159-4243-9b47-33388c25af00","Type":"ContainerDied","Data":"04dfe9f350851317949cb4430e28e473dedb915d710e397b2f47ef19a8937cd1"} Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.088713 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.119209 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6d4744b6dc-djvhk"] Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.119582 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerName="extract-utilities" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.119671 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerName="extract-utilities" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.119814 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.119923 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.120005 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" containerName="extract-content" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.120095 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" containerName="extract-content" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.120175 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.120279 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.120397 4848 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" containerName="extract-content" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.120507 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" containerName="extract-content" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.120598 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.120686 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.120788 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f59a781-9159-4243-9b47-33388c25af00" containerName="controller-manager" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.120867 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f59a781-9159-4243-9b47-33388c25af00" containerName="controller-manager" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.120947 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.121021 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.121116 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerName="extract-utilities" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.121192 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerName="extract-utilities" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.121274 4848 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerName="extract-content" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.121354 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerName="extract-content" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.121432 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1a27259-110e-41d2-b8a8-319c784975a6" containerName="route-controller-manager" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.121507 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1a27259-110e-41d2-b8a8-319c784975a6" containerName="route-controller-manager" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.121592 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerName="extract-content" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.121667 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerName="extract-content" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.121774 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" containerName="extract-utilities" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.121854 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" containerName="extract-utilities" Feb 17 09:08:52 crc kubenswrapper[4848]: E0217 09:08:52.121931 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" containerName="extract-utilities" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.122004 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" containerName="extract-utilities" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.134782 4848 scope.go:117] "RemoveContainer" 
containerID="b3cde2f68e9be0728b046cab7233889221a976c46ec9a15a2516d9310e9373c9" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.135033 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f59a781-9159-4243-9b47-33388c25af00" containerName="controller-manager" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.135062 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3c5c067-a917-4f03-9cd1-6c5b3c1d768a" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.135082 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9547ed7d-6e19-4a09-84f1-8afaae314251" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.135097 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c90e73c-c31d-4c69-a555-f191b15f8cb7" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.135112 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f9aa20a-a818-4f1e-a1e0-345ea27c1832" containerName="registry-server" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.135125 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1a27259-110e-41d2-b8a8-319c784975a6" containerName="route-controller-manager" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.135795 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958"] Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.137100 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.137633 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.149066 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.149179 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.149341 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.149511 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.149608 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.150029 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.150116 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.172867 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5563bca-6a52-4c54-956f-52350b8acb5c-serving-cert\") pod \"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.172909 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-client-ca\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.172933 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-serving-cert\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.172969 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-proxy-ca-bundles\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.172989 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-client-ca\") pod \"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.173483 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.173696 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 17 
09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.173894 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-config\") pod \"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.173987 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cw69\" (UniqueName: \"kubernetes.io/projected/a5563bca-6a52-4c54-956f-52350b8acb5c-kube-api-access-8cw69\") pod \"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.174096 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-config\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.174214 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qgv8\" (UniqueName: \"kubernetes.io/projected/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-kube-api-access-9qgv8\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.175052 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 17 09:08:52 crc 
kubenswrapper[4848]: I0217 09:08:52.177029 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.183065 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.185961 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958"] Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.186171 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.193654 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d4744b6dc-djvhk"] Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.199068 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k"] Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.205037 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6748b9c5fd-rqx7k"] Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.222318 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-768677949c-26mgx"] Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.225227 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-768677949c-26mgx"] Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.275853 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-config\") pod 
\"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.275919 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cw69\" (UniqueName: \"kubernetes.io/projected/a5563bca-6a52-4c54-956f-52350b8acb5c-kube-api-access-8cw69\") pod \"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.275957 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-config\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.276002 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qgv8\" (UniqueName: \"kubernetes.io/projected/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-kube-api-access-9qgv8\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.276033 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5563bca-6a52-4c54-956f-52350b8acb5c-serving-cert\") pod \"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.276054 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-client-ca\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.276077 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-serving-cert\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.276118 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-proxy-ca-bundles\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.276143 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-client-ca\") pod \"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.277823 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-client-ca\") pod \"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " 
pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.277928 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-config\") pod \"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.278064 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-config\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.278480 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-client-ca\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.278577 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-proxy-ca-bundles\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.280830 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5563bca-6a52-4c54-956f-52350b8acb5c-serving-cert\") pod 
\"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.290434 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-serving-cert\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.293067 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qgv8\" (UniqueName: \"kubernetes.io/projected/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-kube-api-access-9qgv8\") pod \"controller-manager-6d4744b6dc-djvhk\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.294062 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cw69\" (UniqueName: \"kubernetes.io/projected/a5563bca-6a52-4c54-956f-52350b8acb5c-kube-api-access-8cw69\") pod \"route-controller-manager-c89cdd98b-8p958\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.490875 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.497222 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.817086 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6d4744b6dc-djvhk"] Feb 17 09:08:52 crc kubenswrapper[4848]: W0217 09:08:52.829132 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf546d757_f0f3_456c_b6bb_6e304b0bf8e5.slice/crio-f1437fe622ccef7f03a06b35eca62f85d0364591c0a49989883c4d29ad75bd62 WatchSource:0}: Error finding container f1437fe622ccef7f03a06b35eca62f85d0364591c0a49989883c4d29ad75bd62: Status 404 returned error can't find the container with id f1437fe622ccef7f03a06b35eca62f85d0364591c0a49989883c4d29ad75bd62 Feb 17 09:08:52 crc kubenswrapper[4848]: I0217 09:08:52.953987 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958"] Feb 17 09:08:52 crc kubenswrapper[4848]: W0217 09:08:52.964782 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5563bca_6a52_4c54_956f_52350b8acb5c.slice/crio-52187df71743a897f3a4048b235bda9dfaeb95625ce04327e7e0e164ed5cc0aa WatchSource:0}: Error finding container 52187df71743a897f3a4048b235bda9dfaeb95625ce04327e7e0e164ed5cc0aa: Status 404 returned error can't find the container with id 52187df71743a897f3a4048b235bda9dfaeb95625ce04327e7e0e164ed5cc0aa Feb 17 09:08:53 crc kubenswrapper[4848]: I0217 09:08:53.111246 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" event={"ID":"f546d757-f0f3-456c-b6bb-6e304b0bf8e5","Type":"ContainerStarted","Data":"a66025d95ef652dc0effe9a52b657fb89de697b9b14a7945d6f248709513c903"} Feb 17 09:08:53 crc kubenswrapper[4848]: I0217 09:08:53.111676 4848 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:53 crc kubenswrapper[4848]: I0217 09:08:53.111693 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" event={"ID":"f546d757-f0f3-456c-b6bb-6e304b0bf8e5","Type":"ContainerStarted","Data":"f1437fe622ccef7f03a06b35eca62f85d0364591c0a49989883c4d29ad75bd62"} Feb 17 09:08:53 crc kubenswrapper[4848]: I0217 09:08:53.117508 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" event={"ID":"a5563bca-6a52-4c54-956f-52350b8acb5c","Type":"ContainerStarted","Data":"52187df71743a897f3a4048b235bda9dfaeb95625ce04327e7e0e164ed5cc0aa"} Feb 17 09:08:53 crc kubenswrapper[4848]: I0217 09:08:53.120498 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:08:53 crc kubenswrapper[4848]: I0217 09:08:53.132740 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" podStartSLOduration=3.1327204220000002 podStartE2EDuration="3.132720422s" podCreationTimestamp="2026-02-17 09:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:08:53.13262984 +0000 UTC m=+210.675885506" watchObservedRunningTime="2026-02-17 09:08:53.132720422 +0000 UTC m=+210.675976068" Feb 17 09:08:53 crc kubenswrapper[4848]: I0217 09:08:53.392108 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f59a781-9159-4243-9b47-33388c25af00" path="/var/lib/kubelet/pods/2f59a781-9159-4243-9b47-33388c25af00/volumes" Feb 17 09:08:53 crc kubenswrapper[4848]: I0217 09:08:53.392912 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f1a27259-110e-41d2-b8a8-319c784975a6" path="/var/lib/kubelet/pods/f1a27259-110e-41d2-b8a8-319c784975a6/volumes" Feb 17 09:08:54 crc kubenswrapper[4848]: I0217 09:08:54.127567 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" event={"ID":"a5563bca-6a52-4c54-956f-52350b8acb5c","Type":"ContainerStarted","Data":"d4a4bce4d9a008f93559bead769e80155e1bc4a0282ea23e8266cd19751ea42c"} Feb 17 09:08:54 crc kubenswrapper[4848]: I0217 09:08:54.156320 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" podStartSLOduration=4.156289345 podStartE2EDuration="4.156289345s" podCreationTimestamp="2026-02-17 09:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:08:54.152152656 +0000 UTC m=+211.695408312" watchObservedRunningTime="2026-02-17 09:08:54.156289345 +0000 UTC m=+211.699545031" Feb 17 09:08:55 crc kubenswrapper[4848]: I0217 09:08:55.137156 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:55 crc kubenswrapper[4848]: I0217 09:08:55.143164 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.045965 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" podUID="1d8cdbb3-b672-4984-8d03-562965a7b081" containerName="oauth-openshift" containerID="cri-o://02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0" gracePeriod=15 Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.505846 4848 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.538542 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-provider-selection\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.538643 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tsjg\" (UniqueName: \"kubernetes.io/projected/1d8cdbb3-b672-4984-8d03-562965a7b081-kube-api-access-5tsjg\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.538711 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-idp-0-file-data\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.538744 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-trusted-ca-bundle\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.538825 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-error\") pod 
\"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.538861 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-session\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.538911 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-login\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.538940 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-router-certs\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.538982 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-cliconfig\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.539019 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-service-ca\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") 
" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.539083 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-ocp-branding-template\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.539136 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-serving-cert\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.539186 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-dir\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.539248 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-policies\") pod \"1d8cdbb3-b672-4984-8d03-562965a7b081\" (UID: \"1d8cdbb3-b672-4984-8d03-562965a7b081\") " Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.540004 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.541036 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.541206 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.542962 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.545570 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.548892 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.549991 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.550482 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.550840 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.551024 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.551430 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.552186 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.552036 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d8cdbb3-b672-4984-8d03-562965a7b081-kube-api-access-5tsjg" (OuterVolumeSpecName: "kube-api-access-5tsjg") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "kube-api-access-5tsjg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.554825 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "1d8cdbb3-b672-4984-8d03-562965a7b081" (UID: "1d8cdbb3-b672-4984-8d03-562965a7b081"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640395 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640892 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640906 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640918 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640928 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640937 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640948 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640958 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640967 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640978 4848 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640990 4848 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1d8cdbb3-b672-4984-8d03-562965a7b081-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.640999 4848 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.641009 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tsjg\" (UniqueName: \"kubernetes.io/projected/1d8cdbb3-b672-4984-8d03-562965a7b081-kube-api-access-5tsjg\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:56 crc kubenswrapper[4848]: I0217 09:08:56.641018 4848 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1d8cdbb3-b672-4984-8d03-562965a7b081-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.130909 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-655cc67ff8-pbwvn"] Feb 17 09:08:57 crc kubenswrapper[4848]: E0217 09:08:57.131918 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d8cdbb3-b672-4984-8d03-562965a7b081" containerName="oauth-openshift" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.131954 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d8cdbb3-b672-4984-8d03-562965a7b081" containerName="oauth-openshift" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.132556 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d8cdbb3-b672-4984-8d03-562965a7b081" containerName="oauth-openshift" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.133839 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.160159 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-655cc67ff8-pbwvn"] Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.174008 4848 generic.go:334] "Generic (PLEG): container finished" podID="1d8cdbb3-b672-4984-8d03-562965a7b081" containerID="02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0" exitCode=0 Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.174173 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.174206 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" event={"ID":"1d8cdbb3-b672-4984-8d03-562965a7b081","Type":"ContainerDied","Data":"02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0"} Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.174236 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xzdww" event={"ID":"1d8cdbb3-b672-4984-8d03-562965a7b081","Type":"ContainerDied","Data":"22d757470f1e9e3324bd6a5f6d590599abe9d7b44654447975e74220d7327564"} Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.174256 4848 scope.go:117] "RemoveContainer" containerID="02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.213059 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xzdww"] Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.213737 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xzdww"] Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 
09:08:57.218303 4848 scope.go:117] "RemoveContainer" containerID="02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0" Feb 17 09:08:57 crc kubenswrapper[4848]: E0217 09:08:57.218708 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0\": container with ID starting with 02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0 not found: ID does not exist" containerID="02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.218742 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0"} err="failed to get container status \"02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0\": rpc error: code = NotFound desc = could not find container \"02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0\": container with ID starting with 02c86f1238b5420acb6bc6a0beaaa954c362725e9545d50692e42c77b68351b0 not found: ID does not exist" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.256434 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.256475 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-template-error\") pod 
\"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.256503 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4be519d1-a9f9-4b5c-abb3-609ebc24f718-audit-dir\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.256564 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-cliconfig\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.256689 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.257253 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 
09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.257351 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-service-ca\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.257448 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-template-login\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.257490 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-serving-cert\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.257526 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-session\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.257580 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-router-certs\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.257623 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-audit-policies\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.257692 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.257715 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65n2t\" (UniqueName: \"kubernetes.io/projected/4be519d1-a9f9-4b5c-abb3-609ebc24f718-kube-api-access-65n2t\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.359972 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: 
\"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360024 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-template-error\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360050 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4be519d1-a9f9-4b5c-abb3-609ebc24f718-audit-dir\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360080 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360107 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-cliconfig\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360158 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360187 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-service-ca\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360233 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-template-login\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360262 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-serving-cert\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360263 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4be519d1-a9f9-4b5c-abb3-609ebc24f718-audit-dir\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " 
pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360284 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-session\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360431 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-router-certs\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360486 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-audit-policies\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360540 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.360579 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65n2t\" 
(UniqueName: \"kubernetes.io/projected/4be519d1-a9f9-4b5c-abb3-609ebc24f718-kube-api-access-65n2t\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.361590 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-audit-policies\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.361749 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-service-ca\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.361803 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.362369 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-cliconfig\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc 
kubenswrapper[4848]: I0217 09:08:57.365581 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-template-error\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.366437 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-router-certs\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.367196 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.367408 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.368205 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-template-login\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.368669 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-serving-cert\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.372176 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.373108 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4be519d1-a9f9-4b5c-abb3-609ebc24f718-v4-0-config-system-session\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.379643 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65n2t\" (UniqueName: \"kubernetes.io/projected/4be519d1-a9f9-4b5c-abb3-609ebc24f718-kube-api-access-65n2t\") pod \"oauth-openshift-655cc67ff8-pbwvn\" (UID: \"4be519d1-a9f9-4b5c-abb3-609ebc24f718\") " pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 
09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.391664 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d8cdbb3-b672-4984-8d03-562965a7b081" path="/var/lib/kubelet/pods/1d8cdbb3-b672-4984-8d03-562965a7b081/volumes" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.473411 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:57 crc kubenswrapper[4848]: I0217 09:08:57.924960 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-655cc67ff8-pbwvn"] Feb 17 09:08:58 crc kubenswrapper[4848]: I0217 09:08:58.188684 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" event={"ID":"4be519d1-a9f9-4b5c-abb3-609ebc24f718","Type":"ContainerStarted","Data":"9ad91a4a1934539de5acc7a60c5f0d69de2666d3c290ca0c9b23778e5dd68728"} Feb 17 09:08:59 crc kubenswrapper[4848]: I0217 09:08:59.196777 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" event={"ID":"4be519d1-a9f9-4b5c-abb3-609ebc24f718","Type":"ContainerStarted","Data":"aa83858ea034f43967521b9e8b57078d38794db5d79eb4d93f05617a0b15de80"} Feb 17 09:08:59 crc kubenswrapper[4848]: I0217 09:08:59.197411 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:59 crc kubenswrapper[4848]: I0217 09:08:59.202078 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" Feb 17 09:08:59 crc kubenswrapper[4848]: I0217 09:08:59.222295 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-655cc67ff8-pbwvn" podStartSLOduration=28.222272226 podStartE2EDuration="28.222272226s" podCreationTimestamp="2026-02-17 
09:08:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:08:59.221337679 +0000 UTC m=+216.764593335" watchObservedRunningTime="2026-02-17 09:08:59.222272226 +0000 UTC m=+216.765527882" Feb 17 09:09:10 crc kubenswrapper[4848]: I0217 09:09:10.735355 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6d4744b6dc-djvhk"] Feb 17 09:09:10 crc kubenswrapper[4848]: I0217 09:09:10.736218 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" podUID="f546d757-f0f3-456c-b6bb-6e304b0bf8e5" containerName="controller-manager" containerID="cri-o://a66025d95ef652dc0effe9a52b657fb89de697b9b14a7945d6f248709513c903" gracePeriod=30 Feb 17 09:09:10 crc kubenswrapper[4848]: I0217 09:09:10.809215 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958"] Feb 17 09:09:10 crc kubenswrapper[4848]: I0217 09:09:10.809449 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" podUID="a5563bca-6a52-4c54-956f-52350b8acb5c" containerName="route-controller-manager" containerID="cri-o://d4a4bce4d9a008f93559bead769e80155e1bc4a0282ea23e8266cd19751ea42c" gracePeriod=30 Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.269979 4848 generic.go:334] "Generic (PLEG): container finished" podID="f546d757-f0f3-456c-b6bb-6e304b0bf8e5" containerID="a66025d95ef652dc0effe9a52b657fb89de697b9b14a7945d6f248709513c903" exitCode=0 Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.270086 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" 
event={"ID":"f546d757-f0f3-456c-b6bb-6e304b0bf8e5","Type":"ContainerDied","Data":"a66025d95ef652dc0effe9a52b657fb89de697b9b14a7945d6f248709513c903"} Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.271510 4848 generic.go:334] "Generic (PLEG): container finished" podID="a5563bca-6a52-4c54-956f-52350b8acb5c" containerID="d4a4bce4d9a008f93559bead769e80155e1bc4a0282ea23e8266cd19751ea42c" exitCode=0 Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.271543 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" event={"ID":"a5563bca-6a52-4c54-956f-52350b8acb5c","Type":"ContainerDied","Data":"d4a4bce4d9a008f93559bead769e80155e1bc4a0282ea23e8266cd19751ea42c"} Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.271567 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" event={"ID":"a5563bca-6a52-4c54-956f-52350b8acb5c","Type":"ContainerDied","Data":"52187df71743a897f3a4048b235bda9dfaeb95625ce04327e7e0e164ed5cc0aa"} Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.271581 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52187df71743a897f3a4048b235bda9dfaeb95625ce04327e7e0e164ed5cc0aa" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.281601 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.361325 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.361911 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-client-ca\") pod \"a5563bca-6a52-4c54-956f-52350b8acb5c\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.361956 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cw69\" (UniqueName: \"kubernetes.io/projected/a5563bca-6a52-4c54-956f-52350b8acb5c-kube-api-access-8cw69\") pod \"a5563bca-6a52-4c54-956f-52350b8acb5c\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.362044 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5563bca-6a52-4c54-956f-52350b8acb5c-serving-cert\") pod \"a5563bca-6a52-4c54-956f-52350b8acb5c\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.362075 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-config\") pod \"a5563bca-6a52-4c54-956f-52350b8acb5c\" (UID: \"a5563bca-6a52-4c54-956f-52350b8acb5c\") " Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.362888 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-client-ca" (OuterVolumeSpecName: "client-ca") pod "a5563bca-6a52-4c54-956f-52350b8acb5c" (UID: "a5563bca-6a52-4c54-956f-52350b8acb5c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.362910 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-config" (OuterVolumeSpecName: "config") pod "a5563bca-6a52-4c54-956f-52350b8acb5c" (UID: "a5563bca-6a52-4c54-956f-52350b8acb5c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.368595 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5563bca-6a52-4c54-956f-52350b8acb5c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a5563bca-6a52-4c54-956f-52350b8acb5c" (UID: "a5563bca-6a52-4c54-956f-52350b8acb5c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.368950 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5563bca-6a52-4c54-956f-52350b8acb5c-kube-api-access-8cw69" (OuterVolumeSpecName: "kube-api-access-8cw69") pod "a5563bca-6a52-4c54-956f-52350b8acb5c" (UID: "a5563bca-6a52-4c54-956f-52350b8acb5c"). InnerVolumeSpecName "kube-api-access-8cw69". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.463615 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-serving-cert\") pod \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.463678 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-config\") pod \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.463740 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qgv8\" (UniqueName: \"kubernetes.io/projected/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-kube-api-access-9qgv8\") pod \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.463783 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-proxy-ca-bundles\") pod \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.463808 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-client-ca\") pod \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\" (UID: \"f546d757-f0f3-456c-b6bb-6e304b0bf8e5\") " Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.464098 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.464115 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cw69\" (UniqueName: \"kubernetes.io/projected/a5563bca-6a52-4c54-956f-52350b8acb5c-kube-api-access-8cw69\") on node \"crc\" DevicePath \"\"" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.464125 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a5563bca-6a52-4c54-956f-52350b8acb5c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.464135 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5563bca-6a52-4c54-956f-52350b8acb5c-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.465498 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f546d757-f0f3-456c-b6bb-6e304b0bf8e5" (UID: "f546d757-f0f3-456c-b6bb-6e304b0bf8e5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.465611 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-config" (OuterVolumeSpecName: "config") pod "f546d757-f0f3-456c-b6bb-6e304b0bf8e5" (UID: "f546d757-f0f3-456c-b6bb-6e304b0bf8e5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.465735 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-client-ca" (OuterVolumeSpecName: "client-ca") pod "f546d757-f0f3-456c-b6bb-6e304b0bf8e5" (UID: "f546d757-f0f3-456c-b6bb-6e304b0bf8e5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.467910 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-kube-api-access-9qgv8" (OuterVolumeSpecName: "kube-api-access-9qgv8") pod "f546d757-f0f3-456c-b6bb-6e304b0bf8e5" (UID: "f546d757-f0f3-456c-b6bb-6e304b0bf8e5"). InnerVolumeSpecName "kube-api-access-9qgv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.468039 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f546d757-f0f3-456c-b6bb-6e304b0bf8e5" (UID: "f546d757-f0f3-456c-b6bb-6e304b0bf8e5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.565008 4848 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.565045 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.565056 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qgv8\" (UniqueName: \"kubernetes.io/projected/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-kube-api-access-9qgv8\") on node \"crc\" DevicePath \"\"" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.565065 4848 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 17 09:09:11 crc kubenswrapper[4848]: I0217 09:09:11.565073 4848 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f546d757-f0f3-456c-b6bb-6e304b0bf8e5-client-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.136574 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57f7df9475-22676"] Feb 17 09:09:12 crc kubenswrapper[4848]: E0217 09:09:12.136969 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f546d757-f0f3-456c-b6bb-6e304b0bf8e5" containerName="controller-manager" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.136992 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f546d757-f0f3-456c-b6bb-6e304b0bf8e5" containerName="controller-manager" Feb 17 09:09:12 crc 
kubenswrapper[4848]: E0217 09:09:12.137025 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5563bca-6a52-4c54-956f-52350b8acb5c" containerName="route-controller-manager" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.137037 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5563bca-6a52-4c54-956f-52350b8acb5c" containerName="route-controller-manager" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.137216 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5563bca-6a52-4c54-956f-52350b8acb5c" containerName="route-controller-manager" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.137238 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f546d757-f0f3-456c-b6bb-6e304b0bf8e5" containerName="controller-manager" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.137834 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.143195 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk"] Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.144173 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.151381 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57f7df9475-22676"] Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.156547 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk"] Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.275470 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7f35653-d13c-45d2-a917-8793c7b7c75a-client-ca\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.275561 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvvks\" (UniqueName: \"kubernetes.io/projected/f7f35653-d13c-45d2-a917-8793c7b7c75a-kube-api-access-cvvks\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.275601 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f35653-d13c-45d2-a917-8793c7b7c75a-config\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.275630 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm8gg\" 
(UniqueName: \"kubernetes.io/projected/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-kube-api-access-gm8gg\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.275669 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7f35653-d13c-45d2-a917-8793c7b7c75a-serving-cert\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.275710 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-config\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.275738 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-serving-cert\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.275815 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7f35653-d13c-45d2-a917-8793c7b7c75a-proxy-ca-bundles\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " 
pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.275884 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-client-ca\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.279526 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.279999 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.279988 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6d4744b6dc-djvhk" event={"ID":"f546d757-f0f3-456c-b6bb-6e304b0bf8e5","Type":"ContainerDied","Data":"f1437fe622ccef7f03a06b35eca62f85d0364591c0a49989883c4d29ad75bd62"} Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.280073 4848 scope.go:117] "RemoveContainer" containerID="a66025d95ef652dc0effe9a52b657fb89de697b9b14a7945d6f248709513c903" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.319041 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958"] Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.325703 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c89cdd98b-8p958"] Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.330876 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-6d4744b6dc-djvhk"] Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.334750 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6d4744b6dc-djvhk"] Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.376861 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-client-ca\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.376912 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7f35653-d13c-45d2-a917-8793c7b7c75a-client-ca\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.376970 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvvks\" (UniqueName: \"kubernetes.io/projected/f7f35653-d13c-45d2-a917-8793c7b7c75a-kube-api-access-cvvks\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.376994 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f35653-d13c-45d2-a917-8793c7b7c75a-config\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 
09:09:12.377016 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm8gg\" (UniqueName: \"kubernetes.io/projected/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-kube-api-access-gm8gg\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.377043 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7f35653-d13c-45d2-a917-8793c7b7c75a-serving-cert\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.377070 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-config\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.377089 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-serving-cert\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.377110 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7f35653-d13c-45d2-a917-8793c7b7c75a-proxy-ca-bundles\") pod \"controller-manager-57f7df9475-22676\" (UID: 
\"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.378387 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-client-ca\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.378558 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-config\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.380273 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7f35653-d13c-45d2-a917-8793c7b7c75a-config\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.380845 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f7f35653-d13c-45d2-a917-8793c7b7c75a-proxy-ca-bundles\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.382109 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f7f35653-d13c-45d2-a917-8793c7b7c75a-client-ca\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.382635 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-serving-cert\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.387575 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7f35653-d13c-45d2-a917-8793c7b7c75a-serving-cert\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.398466 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvvks\" (UniqueName: \"kubernetes.io/projected/f7f35653-d13c-45d2-a917-8793c7b7c75a-kube-api-access-cvvks\") pod \"controller-manager-57f7df9475-22676\" (UID: \"f7f35653-d13c-45d2-a917-8793c7b7c75a\") " pod="openshift-controller-manager/controller-manager-57f7df9475-22676" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.401819 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm8gg\" (UniqueName: \"kubernetes.io/projected/c18ff923-0de4-4b7c-9f8b-5e4585b57a82-kube-api-access-gm8gg\") pod \"route-controller-manager-694d7546d8-2j7sk\" (UID: \"c18ff923-0de4-4b7c-9f8b-5e4585b57a82\") " pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 
09:09:12.495842 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57f7df9475-22676"
Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.510521 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk"
Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.822136 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk"]
Feb 17 09:09:12 crc kubenswrapper[4848]: I0217 09:09:12.982334 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57f7df9475-22676"]
Feb 17 09:09:12 crc kubenswrapper[4848]: W0217 09:09:12.984997 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7f35653_d13c_45d2_a917_8793c7b7c75a.slice/crio-64ef6f9becdc4e96e313fec35e15016b86aed556df4c7a2acd2c547e53de7aca WatchSource:0}: Error finding container 64ef6f9becdc4e96e313fec35e15016b86aed556df4c7a2acd2c547e53de7aca: Status 404 returned error can't find the container with id 64ef6f9becdc4e96e313fec35e15016b86aed556df4c7a2acd2c547e53de7aca
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.288711 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" event={"ID":"c18ff923-0de4-4b7c-9f8b-5e4585b57a82","Type":"ContainerStarted","Data":"a99b39d4106e66c6c7f546e73ff243f9bad953bfa85132355fa696ec531679e0"}
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.288748 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" event={"ID":"c18ff923-0de4-4b7c-9f8b-5e4585b57a82","Type":"ContainerStarted","Data":"5d54649cf6c66c55acae907dbfc36c36a524abdbed4bf8c77a01320cdef157c8"}
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.289703 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk"
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.292438 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57f7df9475-22676" event={"ID":"f7f35653-d13c-45d2-a917-8793c7b7c75a","Type":"ContainerStarted","Data":"8a5097a28b655f92e2bd9ba54dae3eb9d4d1db27d8fdb034f1cbc88e93b698bd"}
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.292506 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-57f7df9475-22676" event={"ID":"f7f35653-d13c-45d2-a917-8793c7b7c75a","Type":"ContainerStarted","Data":"64ef6f9becdc4e96e313fec35e15016b86aed556df4c7a2acd2c547e53de7aca"}
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.292687 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-57f7df9475-22676"
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.298797 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-57f7df9475-22676"
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.305155 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk" podStartSLOduration=3.305139522 podStartE2EDuration="3.305139522s" podCreationTimestamp="2026-02-17 09:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:09:13.304332248 +0000 UTC m=+230.847587904" watchObservedRunningTime="2026-02-17 09:09:13.305139522 +0000 UTC m=+230.848395188"
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.324536 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-57f7df9475-22676" podStartSLOduration=3.324518229 podStartE2EDuration="3.324518229s" podCreationTimestamp="2026-02-17 09:09:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:09:13.320207195 +0000 UTC m=+230.863462861" watchObservedRunningTime="2026-02-17 09:09:13.324518229 +0000 UTC m=+230.867773895"
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.403790 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5563bca-6a52-4c54-956f-52350b8acb5c" path="/var/lib/kubelet/pods/a5563bca-6a52-4c54-956f-52350b8acb5c/volumes"
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.404541 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f546d757-f0f3-456c-b6bb-6e304b0bf8e5" path="/var/lib/kubelet/pods/f546d757-f0f3-456c-b6bb-6e304b0bf8e5/volumes"
Feb 17 09:09:13 crc kubenswrapper[4848]: I0217 09:09:13.445807 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-694d7546d8-2j7sk"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.107808 4848 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.109596 4848 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.109632 4848 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 09:09:17 crc kubenswrapper[4848]: E0217 09:09:17.109962 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.109982 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 09:09:17 crc kubenswrapper[4848]: E0217 09:09:17.109993 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.110001 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 09:09:17 crc kubenswrapper[4848]: E0217 09:09:17.110014 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.110025 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 09:09:17 crc kubenswrapper[4848]: E0217 09:09:17.110040 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.110047 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Feb 17 09:09:17 crc kubenswrapper[4848]: E0217 09:09:17.110058 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.110066 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 09:09:17 crc kubenswrapper[4848]: E0217 09:09:17.110078 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.110087 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 09:09:17 crc kubenswrapper[4848]: E0217 09:09:17.110096 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.110103 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.111601 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.111741 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.111873 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.111896 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.111937 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.111998 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.112550 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.115237 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6" gracePeriod=15
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.115997 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247" gracePeriod=15
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.116456 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf" gracePeriod=15
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.116924 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692" gracePeriod=15
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.117243 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879" gracePeriod=15
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.129356 4848 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.250481 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.250531 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.250558 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.250573 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.250606 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.250630 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.250649 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.250669 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.330009 4848 generic.go:334] "Generic (PLEG): container finished" podID="5822205e-65ea-4135-816e-f7ddedd77d77" containerID="02467ea22d8465dd5a45814c0f3f67241fdbbeee2e8ccc3411cbb1793392c28f" exitCode=0
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.330079 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5822205e-65ea-4135-816e-f7ddedd77d77","Type":"ContainerDied","Data":"02467ea22d8465dd5a45814c0f3f67241fdbbeee2e8ccc3411cbb1793392c28f"}
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.330845 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.332559 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.333858 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.334354 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247" exitCode=0
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.334372 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf" exitCode=0
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.334379 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692" exitCode=0
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.334386 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879" exitCode=2
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.334410 4848 scope.go:117] "RemoveContainer" containerID="9d1b52b4ff7783545bbf665385448388dabc3a36224b1169bc2c4068ecc045e7"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.351702 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.351780 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.351807 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.351870 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.351903 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.351930 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.351956 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.351982 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.352070 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.352120 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.352150 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.352178 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.352205 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.352230 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.352258 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 17 09:09:17 crc kubenswrapper[4848]: I0217 09:09:17.352286 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.344193 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.668148 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.668811 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused"
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.768882 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-var-lock\") pod \"5822205e-65ea-4135-816e-f7ddedd77d77\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") "
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.769010 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-kubelet-dir\") pod \"5822205e-65ea-4135-816e-f7ddedd77d77\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") "
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.769024 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-var-lock" (OuterVolumeSpecName: "var-lock") pod "5822205e-65ea-4135-816e-f7ddedd77d77" (UID: "5822205e-65ea-4135-816e-f7ddedd77d77"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.769069 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5822205e-65ea-4135-816e-f7ddedd77d77-kube-api-access\") pod \"5822205e-65ea-4135-816e-f7ddedd77d77\" (UID: \"5822205e-65ea-4135-816e-f7ddedd77d77\") "
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.769122 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5822205e-65ea-4135-816e-f7ddedd77d77" (UID: "5822205e-65ea-4135-816e-f7ddedd77d77"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.769313 4848 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-var-lock\") on node \"crc\" DevicePath \"\""
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.769332 4848 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5822205e-65ea-4135-816e-f7ddedd77d77-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.779953 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5822205e-65ea-4135-816e-f7ddedd77d77-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5822205e-65ea-4135-816e-f7ddedd77d77" (UID: "5822205e-65ea-4135-816e-f7ddedd77d77"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:09:18 crc kubenswrapper[4848]: I0217 09:09:18.870691 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5822205e-65ea-4135-816e-f7ddedd77d77-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.353738 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5822205e-65ea-4135-816e-f7ddedd77d77","Type":"ContainerDied","Data":"2cc043e6863cec491cd69ae0cae6289cc92d7ae24242e4995bf025684d8f427f"}
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.354197 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cc043e6863cec491cd69ae0cae6289cc92d7ae24242e4995bf025684d8f427f"
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.354265 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.366583 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused"
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.495552 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.496412 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.497052 4848 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused"
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.497360 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused"
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.578113 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.578192 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.578229 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.578292 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.578318 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.578374 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.578578 4848 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.578599 4848 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 17 09:09:19 crc kubenswrapper[4848]: I0217 09:09:19.578607 4848 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.361851 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.363831 4848 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6" exitCode=0
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.363879 4848 scope.go:117] "RemoveContainer" containerID="cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.363985 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.376690 4848 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.376888 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.379616 4848 scope.go:117] "RemoveContainer" containerID="1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.395971 4848 scope.go:117] "RemoveContainer" containerID="e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.410457 4848 scope.go:117] "RemoveContainer" containerID="40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.427535 4848 scope.go:117] "RemoveContainer" containerID="bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.441642 4848 scope.go:117] "RemoveContainer" containerID="1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.471891 4848 scope.go:117] "RemoveContainer" containerID="cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247"
Feb 17 09:09:20 crc kubenswrapper[4848]: E0217 09:09:20.472469 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\": container with ID starting with cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247 not found: ID does not exist" containerID="cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.472520 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247"} err="failed to get container status \"cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\": rpc error: code = NotFound desc = could not find container \"cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247\": container with ID starting with cb3060a4b696c8ffcc2cb26b9e22056048bd507d44fad01459f7eca51f8bd247 not found: ID does not exist"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.472553 4848 scope.go:117] "RemoveContainer" containerID="1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf"
Feb 17 09:09:20 crc kubenswrapper[4848]: E0217 09:09:20.472938 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\": container with ID starting with 1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf not found: ID does not exist" containerID="1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf"
Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.472968 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf"} err="failed to get container status \"1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\": rpc error: code = NotFound desc = could 
not find container \"1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf\": container with ID starting with 1d5f1a299f3a936173d4828667ec6735442a86d01be9f33257d67c6477057ddf not found: ID does not exist" Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.472987 4848 scope.go:117] "RemoveContainer" containerID="e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692" Feb 17 09:09:20 crc kubenswrapper[4848]: E0217 09:09:20.473385 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\": container with ID starting with e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692 not found: ID does not exist" containerID="e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692" Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.473429 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692"} err="failed to get container status \"e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\": rpc error: code = NotFound desc = could not find container \"e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692\": container with ID starting with e903adfdcfab5dd74617ce1b6a310352cdb8b1b4a8ef1fda9bb1d58c8994d692 not found: ID does not exist" Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.473461 4848 scope.go:117] "RemoveContainer" containerID="40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879" Feb 17 09:09:20 crc kubenswrapper[4848]: E0217 09:09:20.473983 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\": container with ID starting with 40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879 not found: 
ID does not exist" containerID="40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879" Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.474080 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879"} err="failed to get container status \"40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\": rpc error: code = NotFound desc = could not find container \"40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879\": container with ID starting with 40aef2a93523ad7e606b2af4336773cde3a0f27b229b50cd16fb9cdc4736c879 not found: ID does not exist" Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.474106 4848 scope.go:117] "RemoveContainer" containerID="bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6" Feb 17 09:09:20 crc kubenswrapper[4848]: E0217 09:09:20.474566 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\": container with ID starting with bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6 not found: ID does not exist" containerID="bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6" Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.474589 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6"} err="failed to get container status \"bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\": rpc error: code = NotFound desc = could not find container \"bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6\": container with ID starting with bbed544448b32d7e5c968c08a15f4d059cc76c5940e50124370705be53938be6 not found: ID does not exist" Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.474604 4848 
scope.go:117] "RemoveContainer" containerID="1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0" Feb 17 09:09:20 crc kubenswrapper[4848]: E0217 09:09:20.474864 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\": container with ID starting with 1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0 not found: ID does not exist" containerID="1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0" Feb 17 09:09:20 crc kubenswrapper[4848]: I0217 09:09:20.474899 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0"} err="failed to get container status \"1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\": rpc error: code = NotFound desc = could not find container \"1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0\": container with ID starting with 1b22602407ac4560c9cbda1c1c0377ff909cd36fca4e1dc9ffeca99fe8b7cde0 not found: ID does not exist" Feb 17 09:09:21 crc kubenswrapper[4848]: I0217 09:09:21.389803 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 17 09:09:22 crc kubenswrapper[4848]: E0217 09:09:22.156285 4848 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 09:09:22 crc kubenswrapper[4848]: I0217 09:09:22.156664 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 09:09:22 crc kubenswrapper[4848]: E0217 09:09:22.188022 4848 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1894fd9336f25d38 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 09:09:22.187320632 +0000 UTC m=+239.730576298,LastTimestamp:2026-02-17 09:09:22.187320632 +0000 UTC m=+239.730576298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 09:09:22 crc kubenswrapper[4848]: I0217 09:09:22.387120 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f7189b243bcff41000812b75ea23aa670ead5dcd6ea619854af76f0f85962810"} Feb 17 09:09:23 crc kubenswrapper[4848]: I0217 09:09:23.389773 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: 
connection refused" Feb 17 09:09:23 crc kubenswrapper[4848]: I0217 09:09:23.397826 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6"} Feb 17 09:09:23 crc kubenswrapper[4848]: E0217 09:09:23.398481 4848 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 09:09:23 crc kubenswrapper[4848]: I0217 09:09:23.399409 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:24 crc kubenswrapper[4848]: E0217 09:09:24.026395 4848 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:24 crc kubenswrapper[4848]: E0217 09:09:24.027107 4848 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:24 crc kubenswrapper[4848]: E0217 09:09:24.027538 4848 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:24 crc kubenswrapper[4848]: 
E0217 09:09:24.027895 4848 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:24 crc kubenswrapper[4848]: E0217 09:09:24.028325 4848 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:24 crc kubenswrapper[4848]: I0217 09:09:24.028383 4848 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 17 09:09:24 crc kubenswrapper[4848]: E0217 09:09:24.028880 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="200ms" Feb 17 09:09:24 crc kubenswrapper[4848]: E0217 09:09:24.230068 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="400ms" Feb 17 09:09:24 crc kubenswrapper[4848]: E0217 09:09:24.405171 4848 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 09:09:24 crc kubenswrapper[4848]: E0217 09:09:24.631475 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": 
dial tcp 38.102.83.69:6443: connect: connection refused" interval="800ms" Feb 17 09:09:25 crc kubenswrapper[4848]: E0217 09:09:25.432891 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="1.6s" Feb 17 09:09:27 crc kubenswrapper[4848]: E0217 09:09:27.034397 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="3.2s" Feb 17 09:09:30 crc kubenswrapper[4848]: E0217 09:09:30.236315 4848 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.69:6443: connect: connection refused" interval="6.4s" Feb 17 09:09:30 crc kubenswrapper[4848]: E0217 09:09:30.468292 4848 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.69:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1894fd9336f25d38 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-17 09:09:22.187320632 +0000 UTC m=+239.730576298,LastTimestamp:2026-02-17 09:09:22.187320632 +0000 UTC m=+239.730576298,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.383115 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.384652 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.407018 4848 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="32cd1450-32ba-4eaa-be52-0a6967aa4683" Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.407081 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="32cd1450-32ba-4eaa-be52-0a6967aa4683" Feb 17 09:09:32 crc kubenswrapper[4848]: E0217 09:09:32.407637 4848 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.408337 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.467144 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.467243 4848 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064" exitCode=1 Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.467357 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064"} Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.468302 4848 scope.go:117] "RemoveContainer" containerID="f14d3845b7775917547b3eab0a9565fe9d10639e08fa699b192f773c3ccbc064" Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.468926 4848 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.469886 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:32 crc kubenswrapper[4848]: I0217 09:09:32.470413 4848 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"de9a8e70e38b71abc34747a3451d42688096ff083e21307dd230e8b1d4cefc4c"} Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.392787 4848 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.393669 4848 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.394106 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.481686 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.481869 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"6a7855e66403be7bc0f7d0204171de13a05190551cbaf2d91fbf38b8b3f7b52c"} Feb 17 
09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.483171 4848 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.484008 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.484466 4848 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.485462 4848 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="50339025ccc156bb28f2bb7d4f259ec29fee50eb7feeea6e1ef3f63b14ce7c52" exitCode=0 Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.485538 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"50339025ccc156bb28f2bb7d4f259ec29fee50eb7feeea6e1ef3f63b14ce7c52"} Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.485915 4848 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="32cd1450-32ba-4eaa-be52-0a6967aa4683" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 
09:09:33.485954 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="32cd1450-32ba-4eaa-be52-0a6967aa4683" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.486648 4848 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:33 crc kubenswrapper[4848]: E0217 09:09:33.486713 4848 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.487145 4848 status_manager.go:851] "Failed to get status for pod" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:33 crc kubenswrapper[4848]: I0217 09:09:33.487496 4848 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.69:6443: connect: connection refused" Feb 17 09:09:34 crc kubenswrapper[4848]: I0217 09:09:34.496372 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1165914fd2f43a7adf15e265ee8dc224aa5a931192da52c803a4c3842ff30747"} Feb 17 09:09:34 crc kubenswrapper[4848]: I0217 09:09:34.497041 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"95c09eb8dc4fe91ac2a9f7dab96742e3bf648046628ba5ba110b11435a3a6d8d"} Feb 17 09:09:34 crc kubenswrapper[4848]: I0217 09:09:34.497061 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"73592cb770786499845ec6894c746e4ebec3f5d19526356d3038d8db869dc10e"} Feb 17 09:09:35 crc kubenswrapper[4848]: I0217 09:09:35.507292 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"1350b8520340dd7b07ea76d586f285106b6c9c18fdf9944e562f1b170fa76322"} Feb 17 09:09:35 crc kubenswrapper[4848]: I0217 09:09:35.507658 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2145618d4b6b87cdc974990a977eb99f4cc7173460a68d2794858324d24a3843"} Feb 17 09:09:35 crc kubenswrapper[4848]: I0217 09:09:35.507733 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:09:35 crc kubenswrapper[4848]: I0217 09:09:35.507964 4848 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="32cd1450-32ba-4eaa-be52-0a6967aa4683" Feb 17 09:09:35 crc kubenswrapper[4848]: I0217 09:09:35.508003 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="32cd1450-32ba-4eaa-be52-0a6967aa4683" Feb 17 09:09:37 crc kubenswrapper[4848]: I0217 09:09:37.070631 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:09:37 crc kubenswrapper[4848]: I0217 09:09:37.409599 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:09:37 crc kubenswrapper[4848]: I0217 09:09:37.409662 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:09:37 crc kubenswrapper[4848]: I0217 09:09:37.417582 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:09:40 crc kubenswrapper[4848]: I0217 09:09:40.516150 4848 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:09:40 crc kubenswrapper[4848]: I0217 09:09:40.535734 4848 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="32cd1450-32ba-4eaa-be52-0a6967aa4683" Feb 17 09:09:40 crc kubenswrapper[4848]: I0217 09:09:40.535783 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="32cd1450-32ba-4eaa-be52-0a6967aa4683" Feb 17 09:09:40 crc kubenswrapper[4848]: I0217 09:09:40.540226 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 17 09:09:40 crc kubenswrapper[4848]: I0217 09:09:40.554982 4848 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5864e70d-1959-4b44-b899-5cdeb3375c61" Feb 17 09:09:41 crc kubenswrapper[4848]: I0217 09:09:41.544087 4848 kubelet.go:1909] "Trying to delete pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="32cd1450-32ba-4eaa-be52-0a6967aa4683" Feb 17 09:09:41 crc kubenswrapper[4848]: I0217 09:09:41.544141 4848 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="32cd1450-32ba-4eaa-be52-0a6967aa4683" Feb 17 09:09:41 crc kubenswrapper[4848]: I0217 09:09:41.550378 4848 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="5864e70d-1959-4b44-b899-5cdeb3375c61" Feb 17 09:09:42 crc kubenswrapper[4848]: I0217 09:09:42.414006 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:09:42 crc kubenswrapper[4848]: I0217 09:09:42.420490 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:09:42 crc kubenswrapper[4848]: I0217 09:09:42.556412 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 17 09:09:49 crc kubenswrapper[4848]: I0217 09:09:49.886625 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 17 09:09:50 crc kubenswrapper[4848]: I0217 09:09:50.766404 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 17 09:09:50 crc kubenswrapper[4848]: I0217 09:09:50.775059 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 17 09:09:50 crc kubenswrapper[4848]: I0217 09:09:50.843095 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 17 09:09:51 crc kubenswrapper[4848]: I0217 09:09:51.091594 
4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 17 09:09:51 crc kubenswrapper[4848]: I0217 09:09:51.299548 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 17 09:09:51 crc kubenswrapper[4848]: I0217 09:09:51.382796 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 17 09:09:51 crc kubenswrapper[4848]: I0217 09:09:51.405661 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Feb 17 09:09:51 crc kubenswrapper[4848]: I0217 09:09:51.638625 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 17 09:09:51 crc kubenswrapper[4848]: I0217 09:09:51.699290 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Feb 17 09:09:51 crc kubenswrapper[4848]: I0217 09:09:51.873186 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 17 09:09:51 crc kubenswrapper[4848]: I0217 09:09:51.944603 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.021534 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.054077 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.130454 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.131880 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.429247 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.445590 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.539554 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.619971 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.718833 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.724876 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 17 09:09:52 crc kubenswrapper[4848]: I0217 09:09:52.990537 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.011261 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.035732 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.094007 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.121949 4848 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.123914 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.179385 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.191193 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.193244 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.415522 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.511416 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.531258 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.551491 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.600343 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.627948 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.648666 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.653225 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.656424 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.663220 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.890922 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 17 09:09:53 crc kubenswrapper[4848]: I0217 09:09:53.998901 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.067310 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.089710 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.090065 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.100735 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.111377 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.198910 4848 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.208487 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.208579 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.218018 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.218379 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.267725 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.267698242 podStartE2EDuration="14.267698242s" podCreationTimestamp="2026-02-17 09:09:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:09:54.244445749 +0000 UTC m=+271.787701405" watchObservedRunningTime="2026-02-17 09:09:54.267698242 +0000 UTC m=+271.810953958"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.325879 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.354663 4848 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.519603 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.549708 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.583482 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.643951 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.675114 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.699278 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.864084 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Feb 17 09:09:54 crc kubenswrapper[4848]: I0217 09:09:54.908656 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.005051 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.042266 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.043807 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.167955 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.270421 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.285399 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.287292 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.288577 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.441572 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.451540 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.530520 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.538948 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.555052 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.596529 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.810447 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 17 09:09:55 crc kubenswrapper[4848]: I0217 09:09:55.909804 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.049204 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.137534 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.277672 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.338856 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.421308 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.423896 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.511587 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.663909 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.698313 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.719033 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.731594 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.757102 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.892861 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.896085 4848 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.961512 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Feb 17 09:09:56 crc kubenswrapper[4848]: I0217 09:09:56.981442 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.038045 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.039746 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.044390 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.079072 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.156183 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.239371 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.387474 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.411412 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.549708 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.564104 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.564221 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.653594 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.765359 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.783858 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.823617 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.870230 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 17 09:09:57 crc kubenswrapper[4848]: I0217 09:09:57.995282 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.032606 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.055236 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.080416 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.123843 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.217426 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.227452 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.262513 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.315797 4848 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.320491 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.511155 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.526304 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.583913 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.647901 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.688280 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.692012 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.739960 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.765053 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.826737 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.931208 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Feb 17 09:09:58 crc kubenswrapper[4848]: I0217 09:09:58.992332 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.010906 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.011642 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.012884 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.132030 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.161878 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.192263 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.197623 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.205524 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.246668 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.265251 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.267288 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.279932 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.319081 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.325037 4848 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.451613 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.461439 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.478406 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.533740 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.563821 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.596788 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.601899 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.614826 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.679380 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.691445 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.804910 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.911943 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.929534 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.949491 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.950127 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 17 09:09:59 crc kubenswrapper[4848]: I0217 09:09:59.983124 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.073510 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.183086 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.189622 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.302140 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.384970 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.408154 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.458429 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.483900 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.582505 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.588485 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.662636 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.758444 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.780580 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.895448 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 17 09:10:00 crc kubenswrapper[4848]: I0217 09:10:00.944300 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.084745 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.129704 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.163294 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.276652 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.349322 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.396876 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.433776 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.497979 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.525226 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.533256 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.596540 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.614956 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.683475 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.746094 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.773365 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.802104 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.809357 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.847717 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.892058 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.897237 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.911149 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 17 09:10:01 crc kubenswrapper[4848]: I0217 09:10:01.989500 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.062581 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.082479 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.101066 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.120234 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.171627 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.189786 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.348719 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.367977 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.380036 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.468333 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.468544 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.572327 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.648242 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.740716 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.754029
4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.899013 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.947389 4848 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 17 09:10:02 crc kubenswrapper[4848]: I0217 09:10:02.949026 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6" gracePeriod=5 Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.106501 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.110215 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.133424 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.240713 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.425438 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.520720 4848 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.571480 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.733448 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.753729 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.800672 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 17 09:10:03 crc kubenswrapper[4848]: I0217 09:10:03.809708 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 17 09:10:04 crc kubenswrapper[4848]: I0217 09:10:04.017087 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 17 09:10:04 crc kubenswrapper[4848]: I0217 09:10:04.028965 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 17 09:10:04 crc kubenswrapper[4848]: I0217 09:10:04.175864 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 17 09:10:04 crc kubenswrapper[4848]: I0217 09:10:04.418009 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 17 09:10:04 crc kubenswrapper[4848]: I0217 09:10:04.439298 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 17 09:10:04 crc kubenswrapper[4848]: I0217 09:10:04.447965 4848 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 17 09:10:04 crc kubenswrapper[4848]: I0217 09:10:04.463595 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 17 09:10:04 crc kubenswrapper[4848]: I0217 09:10:04.552262 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Feb 17 09:10:04 crc kubenswrapper[4848]: I0217 09:10:04.623308 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 17 09:10:04 crc kubenswrapper[4848]: I0217 09:10:04.977417 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 17 09:10:05 crc kubenswrapper[4848]: I0217 09:10:05.342641 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 17 09:10:05 crc kubenswrapper[4848]: I0217 09:10:05.743130 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 17 09:10:05 crc kubenswrapper[4848]: I0217 09:10:05.750639 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 17 09:10:05 crc kubenswrapper[4848]: I0217 09:10:05.798718 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.217984 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4wt92"] Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.218249 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4wt92" 
podUID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerName="registry-server" containerID="cri-o://30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe" gracePeriod=30 Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.225928 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8rb4g"] Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.226191 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8rb4g" podUID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerName="registry-server" containerID="cri-o://5fba36ff660c78451338f51cde4a26f30f2655f77c1f5f7663b7388d93c29793" gracePeriod=30 Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.237635 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mq695"] Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.237925 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" podUID="3afb3ce4-f468-4042-b4d5-61285893e7e1" containerName="marketplace-operator" containerID="cri-o://7c5df998e064bbb570c4d9a8c2c6574ff6e0a24bf26f1944173619b295654039" gracePeriod=30 Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.251248 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9rm9"] Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.251598 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j9rm9" podUID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerName="registry-server" containerID="cri-o://016c29e3e70c33bfad969b7a77460d8ed18f621e0c1997b2e20c3e1d1517a198" gracePeriod=30 Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.260641 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-45twr"] Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.261207 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-45twr" podUID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerName="registry-server" containerID="cri-o://0dfd6374652a02df264a081a6853759e908e8426ce7b5e274a031b6b96f0c01a" gracePeriod=30 Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.271834 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xvh8l"] Feb 17 09:10:06 crc kubenswrapper[4848]: E0217 09:10:06.272109 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" containerName="installer" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.272132 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" containerName="installer" Feb 17 09:10:06 crc kubenswrapper[4848]: E0217 09:10:06.272147 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.272155 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.272269 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.272287 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5822205e-65ea-4135-816e-f7ddedd77d77" containerName="installer" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.272780 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.286558 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xvh8l"] Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.441614 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd7df0ba-ff8c-48ce-ad07-8ac50003f318-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xvh8l\" (UID: \"cd7df0ba-ff8c-48ce-ad07-8ac50003f318\") " pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.441867 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cd7df0ba-ff8c-48ce-ad07-8ac50003f318-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xvh8l\" (UID: \"cd7df0ba-ff8c-48ce-ad07-8ac50003f318\") " pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.441927 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zfcl\" (UniqueName: \"kubernetes.io/projected/cd7df0ba-ff8c-48ce-ad07-8ac50003f318-kube-api-access-8zfcl\") pod \"marketplace-operator-79b997595-xvh8l\" (UID: \"cd7df0ba-ff8c-48ce-ad07-8ac50003f318\") " pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.543600 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cd7df0ba-ff8c-48ce-ad07-8ac50003f318-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xvh8l\" (UID: 
\"cd7df0ba-ff8c-48ce-ad07-8ac50003f318\") " pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.543668 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zfcl\" (UniqueName: \"kubernetes.io/projected/cd7df0ba-ff8c-48ce-ad07-8ac50003f318-kube-api-access-8zfcl\") pod \"marketplace-operator-79b997595-xvh8l\" (UID: \"cd7df0ba-ff8c-48ce-ad07-8ac50003f318\") " pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.543720 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd7df0ba-ff8c-48ce-ad07-8ac50003f318-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xvh8l\" (UID: \"cd7df0ba-ff8c-48ce-ad07-8ac50003f318\") " pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.545114 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cd7df0ba-ff8c-48ce-ad07-8ac50003f318-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-xvh8l\" (UID: \"cd7df0ba-ff8c-48ce-ad07-8ac50003f318\") " pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.553099 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/cd7df0ba-ff8c-48ce-ad07-8ac50003f318-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-xvh8l\" (UID: \"cd7df0ba-ff8c-48ce-ad07-8ac50003f318\") " pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.564181 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zfcl\" 
(UniqueName: \"kubernetes.io/projected/cd7df0ba-ff8c-48ce-ad07-8ac50003f318-kube-api-access-8zfcl\") pod \"marketplace-operator-79b997595-xvh8l\" (UID: \"cd7df0ba-ff8c-48ce-ad07-8ac50003f318\") " pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.704626 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.712082 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.712631 4848 generic.go:334] "Generic (PLEG): container finished" podID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerID="30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe" exitCode=0 Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.712711 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wt92" event={"ID":"09ac4713-0dcb-4908-8063-b6e029c132d7","Type":"ContainerDied","Data":"30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe"} Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.712779 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4wt92" event={"ID":"09ac4713-0dcb-4908-8063-b6e029c132d7","Type":"ContainerDied","Data":"33f0aaca82ba82440176ffed2342fc83689e9a9796c5a5fdaeab783480d199b1"} Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.712801 4848 scope.go:117] "RemoveContainer" containerID="30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.719998 4848 generic.go:334] "Generic (PLEG): container finished" podID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerID="5fba36ff660c78451338f51cde4a26f30f2655f77c1f5f7663b7388d93c29793" exitCode=0 
Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.720047 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rb4g" event={"ID":"38529e93-75d3-4b08-a3dc-939fab0cbf66","Type":"ContainerDied","Data":"5fba36ff660c78451338f51cde4a26f30f2655f77c1f5f7663b7388d93c29793"} Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.723095 4848 generic.go:334] "Generic (PLEG): container finished" podID="3afb3ce4-f468-4042-b4d5-61285893e7e1" containerID="7c5df998e064bbb570c4d9a8c2c6574ff6e0a24bf26f1944173619b295654039" exitCode=0 Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.723132 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" event={"ID":"3afb3ce4-f468-4042-b4d5-61285893e7e1","Type":"ContainerDied","Data":"7c5df998e064bbb570c4d9a8c2c6574ff6e0a24bf26f1944173619b295654039"} Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.733182 4848 generic.go:334] "Generic (PLEG): container finished" podID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerID="0dfd6374652a02df264a081a6853759e908e8426ce7b5e274a031b6b96f0c01a" exitCode=0 Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.733237 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45twr" event={"ID":"b3a74de7-62f1-46c2-b518-6baa8b222b1b","Type":"ContainerDied","Data":"0dfd6374652a02df264a081a6853759e908e8426ce7b5e274a031b6b96f0c01a"} Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.740917 4848 generic.go:334] "Generic (PLEG): container finished" podID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerID="016c29e3e70c33bfad969b7a77460d8ed18f621e0c1997b2e20c3e1d1517a198" exitCode=0 Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.740964 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9rm9" 
event={"ID":"f224f79c-f6d1-442a-be23-4fc8e7527d3a","Type":"ContainerDied","Data":"016c29e3e70c33bfad969b7a77460d8ed18f621e0c1997b2e20c3e1d1517a198"} Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.758742 4848 scope.go:117] "RemoveContainer" containerID="1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.776950 4848 scope.go:117] "RemoveContainer" containerID="1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.797534 4848 scope.go:117] "RemoveContainer" containerID="30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe" Feb 17 09:10:06 crc kubenswrapper[4848]: E0217 09:10:06.798030 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe\": container with ID starting with 30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe not found: ID does not exist" containerID="30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.798061 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe"} err="failed to get container status \"30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe\": rpc error: code = NotFound desc = could not find container \"30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe\": container with ID starting with 30879b3becc77393014c722356aff2e79ed08d116060d24a6d8a83f6bb6db6fe not found: ID does not exist" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.798087 4848 scope.go:117] "RemoveContainer" containerID="1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce" Feb 17 09:10:06 crc kubenswrapper[4848]: E0217 09:10:06.798957 4848 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce\": container with ID starting with 1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce not found: ID does not exist" containerID="1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.798982 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce"} err="failed to get container status \"1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce\": rpc error: code = NotFound desc = could not find container \"1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce\": container with ID starting with 1fcc8d53f6d7179b3f218d4dbd02c3023bedde71cb14bda3a94a3f5729df28ce not found: ID does not exist" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.798995 4848 scope.go:117] "RemoveContainer" containerID="1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43" Feb 17 09:10:06 crc kubenswrapper[4848]: E0217 09:10:06.799224 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43\": container with ID starting with 1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43 not found: ID does not exist" containerID="1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.799243 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43"} err="failed to get container status \"1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43\": rpc error: code = NotFound 
desc = could not find container \"1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43\": container with ID starting with 1781b99b2b68625c19a3b2171d2473bc06b17540859b0a637e0a57f87e8fee43 not found: ID does not exist" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.810357 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.818033 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.823898 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.826842 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.847289 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-87hq6\" (UniqueName: \"kubernetes.io/projected/09ac4713-0dcb-4908-8063-b6e029c132d7-kube-api-access-87hq6\") pod \"09ac4713-0dcb-4908-8063-b6e029c132d7\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.847374 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-catalog-content\") pod \"09ac4713-0dcb-4908-8063-b6e029c132d7\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.847467 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-utilities\") pod \"09ac4713-0dcb-4908-8063-b6e029c132d7\" (UID: \"09ac4713-0dcb-4908-8063-b6e029c132d7\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.848553 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-utilities" (OuterVolumeSpecName: "utilities") pod "09ac4713-0dcb-4908-8063-b6e029c132d7" (UID: "09ac4713-0dcb-4908-8063-b6e029c132d7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.851662 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ac4713-0dcb-4908-8063-b6e029c132d7-kube-api-access-87hq6" (OuterVolumeSpecName: "kube-api-access-87hq6") pod "09ac4713-0dcb-4908-8063-b6e029c132d7" (UID: "09ac4713-0dcb-4908-8063-b6e029c132d7"). InnerVolumeSpecName "kube-api-access-87hq6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.903751 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09ac4713-0dcb-4908-8063-b6e029c132d7" (UID: "09ac4713-0dcb-4908-8063-b6e029c132d7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948406 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjr7q\" (UniqueName: \"kubernetes.io/projected/f224f79c-f6d1-442a-be23-4fc8e7527d3a-kube-api-access-fjr7q\") pod \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948456 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-operator-metrics\") pod \"3afb3ce4-f468-4042-b4d5-61285893e7e1\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948482 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-utilities\") pod \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948540 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-trusted-ca\") pod \"3afb3ce4-f468-4042-b4d5-61285893e7e1\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948560 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-utilities\") pod \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948598 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-wq94s\" (UniqueName: \"kubernetes.io/projected/3afb3ce4-f468-4042-b4d5-61285893e7e1-kube-api-access-wq94s\") pod \"3afb3ce4-f468-4042-b4d5-61285893e7e1\" (UID: \"3afb3ce4-f468-4042-b4d5-61285893e7e1\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948615 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-catalog-content\") pod \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\" (UID: \"f224f79c-f6d1-442a-be23-4fc8e7527d3a\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948634 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-utilities\") pod \"38529e93-75d3-4b08-a3dc-939fab0cbf66\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948651 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-catalog-content\") pod \"38529e93-75d3-4b08-a3dc-939fab0cbf66\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948683 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-catalog-content\") pod \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\" (UID: \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948708 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xlqg7\" (UniqueName: \"kubernetes.io/projected/b3a74de7-62f1-46c2-b518-6baa8b222b1b-kube-api-access-xlqg7\") pod \"b3a74de7-62f1-46c2-b518-6baa8b222b1b\" (UID: 
\"b3a74de7-62f1-46c2-b518-6baa8b222b1b\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948734 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crzcq\" (UniqueName: \"kubernetes.io/projected/38529e93-75d3-4b08-a3dc-939fab0cbf66-kube-api-access-crzcq\") pod \"38529e93-75d3-4b08-a3dc-939fab0cbf66\" (UID: \"38529e93-75d3-4b08-a3dc-939fab0cbf66\") " Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948913 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948925 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-87hq6\" (UniqueName: \"kubernetes.io/projected/09ac4713-0dcb-4908-8063-b6e029c132d7-kube-api-access-87hq6\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.948934 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09ac4713-0dcb-4908-8063-b6e029c132d7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.949809 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-utilities" (OuterVolumeSpecName: "utilities") pod "38529e93-75d3-4b08-a3dc-939fab0cbf66" (UID: "38529e93-75d3-4b08-a3dc-939fab0cbf66"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.950142 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "3afb3ce4-f468-4042-b4d5-61285893e7e1" (UID: "3afb3ce4-f468-4042-b4d5-61285893e7e1"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.950664 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-utilities" (OuterVolumeSpecName: "utilities") pod "f224f79c-f6d1-442a-be23-4fc8e7527d3a" (UID: "f224f79c-f6d1-442a-be23-4fc8e7527d3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.950718 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-utilities" (OuterVolumeSpecName: "utilities") pod "b3a74de7-62f1-46c2-b518-6baa8b222b1b" (UID: "b3a74de7-62f1-46c2-b518-6baa8b222b1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.952815 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38529e93-75d3-4b08-a3dc-939fab0cbf66-kube-api-access-crzcq" (OuterVolumeSpecName: "kube-api-access-crzcq") pod "38529e93-75d3-4b08-a3dc-939fab0cbf66" (UID: "38529e93-75d3-4b08-a3dc-939fab0cbf66"). InnerVolumeSpecName "kube-api-access-crzcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.954123 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "3afb3ce4-f468-4042-b4d5-61285893e7e1" (UID: "3afb3ce4-f468-4042-b4d5-61285893e7e1"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.954186 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f224f79c-f6d1-442a-be23-4fc8e7527d3a-kube-api-access-fjr7q" (OuterVolumeSpecName: "kube-api-access-fjr7q") pod "f224f79c-f6d1-442a-be23-4fc8e7527d3a" (UID: "f224f79c-f6d1-442a-be23-4fc8e7527d3a"). InnerVolumeSpecName "kube-api-access-fjr7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.955848 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afb3ce4-f468-4042-b4d5-61285893e7e1-kube-api-access-wq94s" (OuterVolumeSpecName: "kube-api-access-wq94s") pod "3afb3ce4-f468-4042-b4d5-61285893e7e1" (UID: "3afb3ce4-f468-4042-b4d5-61285893e7e1"). InnerVolumeSpecName "kube-api-access-wq94s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.964952 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3a74de7-62f1-46c2-b518-6baa8b222b1b-kube-api-access-xlqg7" (OuterVolumeSpecName: "kube-api-access-xlqg7") pod "b3a74de7-62f1-46c2-b518-6baa8b222b1b" (UID: "b3a74de7-62f1-46c2-b518-6baa8b222b1b"). InnerVolumeSpecName "kube-api-access-xlqg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:10:06 crc kubenswrapper[4848]: I0217 09:10:06.991673 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f224f79c-f6d1-442a-be23-4fc8e7527d3a" (UID: "f224f79c-f6d1-442a-be23-4fc8e7527d3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.007587 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38529e93-75d3-4b08-a3dc-939fab0cbf66" (UID: "38529e93-75d3-4b08-a3dc-939fab0cbf66"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050081 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wq94s\" (UniqueName: \"kubernetes.io/projected/3afb3ce4-f468-4042-b4d5-61285893e7e1-kube-api-access-wq94s\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050117 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050141 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050152 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38529e93-75d3-4b08-a3dc-939fab0cbf66-catalog-content\") on 
node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050163 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xlqg7\" (UniqueName: \"kubernetes.io/projected/b3a74de7-62f1-46c2-b518-6baa8b222b1b-kube-api-access-xlqg7\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050173 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crzcq\" (UniqueName: \"kubernetes.io/projected/38529e93-75d3-4b08-a3dc-939fab0cbf66-kube-api-access-crzcq\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050181 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjr7q\" (UniqueName: \"kubernetes.io/projected/f224f79c-f6d1-442a-be23-4fc8e7527d3a-kube-api-access-fjr7q\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050190 4848 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050198 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050221 4848 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3afb3ce4-f468-4042-b4d5-61285893e7e1-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.050232 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f224f79c-f6d1-442a-be23-4fc8e7527d3a-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc 
kubenswrapper[4848]: I0217 09:10:07.064727 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3a74de7-62f1-46c2-b518-6baa8b222b1b" (UID: "b3a74de7-62f1-46c2-b518-6baa8b222b1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.151690 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3a74de7-62f1-46c2-b518-6baa8b222b1b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.505675 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.748885 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" event={"ID":"3afb3ce4-f468-4042-b4d5-61285893e7e1","Type":"ContainerDied","Data":"73f058631bb87c1327ec8f52182a3b757adfb0948790bcec22ef28865325de58"} Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.749029 4848 scope.go:117] "RemoveContainer" containerID="7c5df998e064bbb570c4d9a8c2c6574ff6e0a24bf26f1944173619b295654039" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.748946 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-mq695" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.752183 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-45twr" event={"ID":"b3a74de7-62f1-46c2-b518-6baa8b222b1b","Type":"ContainerDied","Data":"90b19a261bae92d72c069e469b412c718ce9bd6ff89a21e491c6a399336b8ca2"} Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.752466 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-45twr" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.756119 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j9rm9" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.756320 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j9rm9" event={"ID":"f224f79c-f6d1-442a-be23-4fc8e7527d3a","Type":"ContainerDied","Data":"5bd4d5e9df75f7899de3623f31cd68265db667b82f244bcd0294608277e0407b"} Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.757367 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4wt92" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.761288 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8rb4g" event={"ID":"38529e93-75d3-4b08-a3dc-939fab0cbf66","Type":"ContainerDied","Data":"c479c8c4c09008225dffab0f49b8e5c1b34d719df315f3fecde26faff08c1db2"} Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.761416 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8rb4g" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.790754 4848 scope.go:117] "RemoveContainer" containerID="0dfd6374652a02df264a081a6853759e908e8426ce7b5e274a031b6b96f0c01a" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.809358 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-45twr"] Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.818480 4848 scope.go:117] "RemoveContainer" containerID="ec95d1591dc1890ddce80c4050574469fa8ba0f01b46c3320dad3b29f8da4f1f" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.825074 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-45twr"] Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.835283 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9rm9"] Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.838754 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j9rm9"] Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.841576 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8rb4g"] Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.844253 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8rb4g"] Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.847889 4848 scope.go:117] "RemoveContainer" containerID="a94d3b73f2531fafe04b2d4e95713cc1de43392f761af2a548ffa9c2ce941e6f" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.868085 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4wt92"] Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.876784 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4wt92"] Feb 17 
09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.882700 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mq695"] Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.886219 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-mq695"] Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.889448 4848 scope.go:117] "RemoveContainer" containerID="016c29e3e70c33bfad969b7a77460d8ed18f621e0c1997b2e20c3e1d1517a198" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.912418 4848 scope.go:117] "RemoveContainer" containerID="9cd61d8f24fd187fbe89d4894eccc26dcf241c909879449d0d181ef6389f5352" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.928953 4848 scope.go:117] "RemoveContainer" containerID="f65d32f0f84f3013e2bbf6865a1cd0abb59d8b680ec7c5a9d9db009a7d9b42a4" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.943981 4848 scope.go:117] "RemoveContainer" containerID="5fba36ff660c78451338f51cde4a26f30f2655f77c1f5f7663b7388d93c29793" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.959820 4848 scope.go:117] "RemoveContainer" containerID="cf987ef05be8d4c0f9e3ff66a958ba259bcb73a5c640f68afc9026b1bdb17c17" Feb 17 09:10:07 crc kubenswrapper[4848]: I0217 09:10:07.981668 4848 scope.go:117] "RemoveContainer" containerID="23c7a51a08b983d84c8f2d19c1860f417a1f6583e065fe1609b0e1e2c0dd98e6" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.545053 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.545146 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.672996 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.673459 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.673506 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.673160 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.673613 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.673673 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.673679 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.673753 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.673777 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.674283 4848 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.674320 4848 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.674339 4848 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.674361 4848 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.685127 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.776907 4848 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.781458 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.781518 4848 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6" exitCode=137 Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.781559 4848 scope.go:117] "RemoveContainer" containerID="d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.781605 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.809223 4848 scope.go:117] "RemoveContainer" containerID="d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6" Feb 17 09:10:08 crc kubenswrapper[4848]: E0217 09:10:08.809737 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6\": container with ID starting with d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6 not found: ID does not exist" containerID="d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6" Feb 17 09:10:08 crc kubenswrapper[4848]: I0217 09:10:08.809807 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6"} err="failed to get container status \"d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6\": rpc error: code = NotFound desc = could not find container \"d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6\": container with ID starting with d0a3286cde90725c0d71910e02c7feb577c8a1072021cc1b2fb4447db429f3a6 not found: ID does not exist" Feb 17 09:10:09 crc kubenswrapper[4848]: I0217 09:10:09.390321 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ac4713-0dcb-4908-8063-b6e029c132d7" path="/var/lib/kubelet/pods/09ac4713-0dcb-4908-8063-b6e029c132d7/volumes" Feb 17 09:10:09 crc kubenswrapper[4848]: I0217 09:10:09.391043 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38529e93-75d3-4b08-a3dc-939fab0cbf66" path="/var/lib/kubelet/pods/38529e93-75d3-4b08-a3dc-939fab0cbf66/volumes" Feb 17 09:10:09 crc kubenswrapper[4848]: I0217 09:10:09.391632 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3afb3ce4-f468-4042-b4d5-61285893e7e1" path="/var/lib/kubelet/pods/3afb3ce4-f468-4042-b4d5-61285893e7e1/volumes" Feb 17 09:10:09 crc kubenswrapper[4848]: I0217 09:10:09.392559 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" path="/var/lib/kubelet/pods/b3a74de7-62f1-46c2-b518-6baa8b222b1b/volumes" Feb 17 09:10:09 crc kubenswrapper[4848]: I0217 09:10:09.393107 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" path="/var/lib/kubelet/pods/f224f79c-f6d1-442a-be23-4fc8e7527d3a/volumes" Feb 17 09:10:09 crc kubenswrapper[4848]: I0217 09:10:09.394032 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Feb 17 09:10:09 crc kubenswrapper[4848]: E0217 09:10:09.991284 4848 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 17 09:10:09 crc kubenswrapper[4848]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-xvh8l_openshift-marketplace_cd7df0ba-ff8c-48ce-ad07-8ac50003f318_0(dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa): error adding pod openshift-marketplace_marketplace-operator-79b997595-xvh8l to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa" Netns:"/var/run/netns/bdbf434b-0745-44f9-b792-749ae857a403" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-xvh8l;K8S_POD_INFRA_CONTAINER_ID=dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa;K8S_POD_UID=cd7df0ba-ff8c-48ce-ad07-8ac50003f318" Path:"" ERRORED: error configuring pod 
[openshift-marketplace/marketplace-operator-79b997595-xvh8l] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-xvh8l/cd7df0ba-ff8c-48ce-ad07-8ac50003f318]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-xvh8l in out of cluster comm: pod "marketplace-operator-79b997595-xvh8l" not found Feb 17 09:10:09 crc kubenswrapper[4848]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 09:10:09 crc kubenswrapper[4848]: > Feb 17 09:10:09 crc kubenswrapper[4848]: E0217 09:10:09.991586 4848 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 17 09:10:09 crc kubenswrapper[4848]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-xvh8l_openshift-marketplace_cd7df0ba-ff8c-48ce-ad07-8ac50003f318_0(dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa): error adding pod openshift-marketplace_marketplace-operator-79b997595-xvh8l to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa" Netns:"/var/run/netns/bdbf434b-0745-44f9-b792-749ae857a403" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-xvh8l;K8S_POD_INFRA_CONTAINER_ID=dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa;K8S_POD_UID=cd7df0ba-ff8c-48ce-ad07-8ac50003f318" Path:"" ERRORED: error configuring pod 
[openshift-marketplace/marketplace-operator-79b997595-xvh8l] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-xvh8l/cd7df0ba-ff8c-48ce-ad07-8ac50003f318]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-xvh8l in out of cluster comm: pod "marketplace-operator-79b997595-xvh8l" not found Feb 17 09:10:09 crc kubenswrapper[4848]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 09:10:09 crc kubenswrapper[4848]: > pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:09 crc kubenswrapper[4848]: E0217 09:10:09.991610 4848 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 17 09:10:09 crc kubenswrapper[4848]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-xvh8l_openshift-marketplace_cd7df0ba-ff8c-48ce-ad07-8ac50003f318_0(dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa): error adding pod openshift-marketplace_marketplace-operator-79b997595-xvh8l to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa" Netns:"/var/run/netns/bdbf434b-0745-44f9-b792-749ae857a403" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-xvh8l;K8S_POD_INFRA_CONTAINER_ID=dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa;K8S_POD_UID=cd7df0ba-ff8c-48ce-ad07-8ac50003f318" 
Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-xvh8l] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-xvh8l/cd7df0ba-ff8c-48ce-ad07-8ac50003f318]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-xvh8l in out of cluster comm: pod "marketplace-operator-79b997595-xvh8l" not found Feb 17 09:10:09 crc kubenswrapper[4848]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 09:10:09 crc kubenswrapper[4848]: > pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:09 crc kubenswrapper[4848]: E0217 09:10:09.991670 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"marketplace-operator-79b997595-xvh8l_openshift-marketplace(cd7df0ba-ff8c-48ce-ad07-8ac50003f318)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"marketplace-operator-79b997595-xvh8l_openshift-marketplace(cd7df0ba-ff8c-48ce-ad07-8ac50003f318)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-xvh8l_openshift-marketplace_cd7df0ba-ff8c-48ce-ad07-8ac50003f318_0(dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa): error adding pod openshift-marketplace_marketplace-operator-79b997595-xvh8l to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa\\\" 
Netns:\\\"/var/run/netns/bdbf434b-0745-44f9-b792-749ae857a403\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-xvh8l;K8S_POD_INFRA_CONTAINER_ID=dd86bafd67862ce49e7a1b2a436ba97ae6e096ea8811688d7809b1ef70f8fffa;K8S_POD_UID=cd7df0ba-ff8c-48ce-ad07-8ac50003f318\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-xvh8l] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-xvh8l/cd7df0ba-ff8c-48ce-ad07-8ac50003f318]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-xvh8l in out of cluster comm: pod \\\"marketplace-operator-79b997595-xvh8l\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" podUID="cd7df0ba-ff8c-48ce-ad07-8ac50003f318" Feb 17 09:10:10 crc kubenswrapper[4848]: I0217 09:10:10.795433 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:10 crc kubenswrapper[4848]: I0217 09:10:10.796542 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:14 crc kubenswrapper[4848]: E0217 09:10:14.006454 4848 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 17 09:10:14 crc kubenswrapper[4848]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-xvh8l_openshift-marketplace_cd7df0ba-ff8c-48ce-ad07-8ac50003f318_0(c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1): error adding pod openshift-marketplace_marketplace-operator-79b997595-xvh8l to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1" Netns:"/var/run/netns/7a49480c-8aed-4798-8904-f4f8f62d199a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-xvh8l;K8S_POD_INFRA_CONTAINER_ID=c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1;K8S_POD_UID=cd7df0ba-ff8c-48ce-ad07-8ac50003f318" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-xvh8l] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-xvh8l/cd7df0ba-ff8c-48ce-ad07-8ac50003f318]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-xvh8l in out of cluster comm: pod "marketplace-operator-79b997595-xvh8l" not found Feb 17 09:10:14 crc kubenswrapper[4848]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 09:10:14 crc kubenswrapper[4848]: > Feb 17 09:10:14 crc kubenswrapper[4848]: E0217 09:10:14.006830 4848 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 17 09:10:14 crc kubenswrapper[4848]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-xvh8l_openshift-marketplace_cd7df0ba-ff8c-48ce-ad07-8ac50003f318_0(c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1): error adding pod openshift-marketplace_marketplace-operator-79b997595-xvh8l to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1" Netns:"/var/run/netns/7a49480c-8aed-4798-8904-f4f8f62d199a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-xvh8l;K8S_POD_INFRA_CONTAINER_ID=c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1;K8S_POD_UID=cd7df0ba-ff8c-48ce-ad07-8ac50003f318" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-xvh8l] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-xvh8l/cd7df0ba-ff8c-48ce-ad07-8ac50003f318]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-xvh8l in out of cluster comm: pod "marketplace-operator-79b997595-xvh8l" not found Feb 17 09:10:14 crc kubenswrapper[4848]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 09:10:14 crc kubenswrapper[4848]: > pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:14 crc kubenswrapper[4848]: E0217 09:10:14.006856 4848 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Feb 17 09:10:14 crc kubenswrapper[4848]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-xvh8l_openshift-marketplace_cd7df0ba-ff8c-48ce-ad07-8ac50003f318_0(c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1): error adding pod openshift-marketplace_marketplace-operator-79b997595-xvh8l to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1" Netns:"/var/run/netns/7a49480c-8aed-4798-8904-f4f8f62d199a" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-xvh8l;K8S_POD_INFRA_CONTAINER_ID=c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1;K8S_POD_UID=cd7df0ba-ff8c-48ce-ad07-8ac50003f318" Path:"" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-xvh8l] networking: Multus: [openshift-marketplace/marketplace-operator-79b997595-xvh8l/cd7df0ba-ff8c-48ce-ad07-8ac50003f318]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-xvh8l in out of cluster comm: pod "marketplace-operator-79b997595-xvh8l" not found Feb 17 09:10:14 crc 
kubenswrapper[4848]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 17 09:10:14 crc kubenswrapper[4848]: > pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:14 crc kubenswrapper[4848]: E0217 09:10:14.006935 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"marketplace-operator-79b997595-xvh8l_openshift-marketplace(cd7df0ba-ff8c-48ce-ad07-8ac50003f318)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"marketplace-operator-79b997595-xvh8l_openshift-marketplace(cd7df0ba-ff8c-48ce-ad07-8ac50003f318)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_marketplace-operator-79b997595-xvh8l_openshift-marketplace_cd7df0ba-ff8c-48ce-ad07-8ac50003f318_0(c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1): error adding pod openshift-marketplace_marketplace-operator-79b997595-xvh8l to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1\\\" Netns:\\\"/var/run/netns/7a49480c-8aed-4798-8904-f4f8f62d199a\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-marketplace;K8S_POD_NAME=marketplace-operator-79b997595-xvh8l;K8S_POD_INFRA_CONTAINER_ID=c389e401935a70f22ec84a0021e7bdd96021c42de8f75c7426768659de29e3b1;K8S_POD_UID=cd7df0ba-ff8c-48ce-ad07-8ac50003f318\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-marketplace/marketplace-operator-79b997595-xvh8l] networking: Multus: 
[openshift-marketplace/marketplace-operator-79b997595-xvh8l/cd7df0ba-ff8c-48ce-ad07-8ac50003f318]: error setting the networks status, pod was already deleted: SetPodNetworkStatusAnnotation: failed to query the pod marketplace-operator-79b997595-xvh8l in out of cluster comm: pod \\\"marketplace-operator-79b997595-xvh8l\\\" not found\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" podUID="cd7df0ba-ff8c-48ce-ad07-8ac50003f318" Feb 17 09:10:15 crc kubenswrapper[4848]: I0217 09:10:15.543276 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Feb 17 09:10:17 crc kubenswrapper[4848]: I0217 09:10:17.607592 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 17 09:10:18 crc kubenswrapper[4848]: I0217 09:10:18.702275 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 17 09:10:20 crc kubenswrapper[4848]: I0217 09:10:20.929620 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 17 09:10:21 crc kubenswrapper[4848]: I0217 09:10:21.140036 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 17 09:10:23 crc kubenswrapper[4848]: I0217 09:10:23.061076 4848 
cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Feb 17 09:10:25 crc kubenswrapper[4848]: I0217 09:10:25.538145 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 17 09:10:26 crc kubenswrapper[4848]: I0217 09:10:26.007494 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Feb 17 09:10:27 crc kubenswrapper[4848]: I0217 09:10:27.569239 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 17 09:10:28 crc kubenswrapper[4848]: I0217 09:10:28.382683 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:28 crc kubenswrapper[4848]: I0217 09:10:28.383186 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:28 crc kubenswrapper[4848]: I0217 09:10:28.709846 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 17 09:10:28 crc kubenswrapper[4848]: I0217 09:10:28.832348 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-xvh8l"] Feb 17 09:10:29 crc kubenswrapper[4848]: I0217 09:10:29.086320 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" event={"ID":"cd7df0ba-ff8c-48ce-ad07-8ac50003f318","Type":"ContainerStarted","Data":"73be34da18c21c306d8298256d81a846a83e3defe2e5a4214dbcd1536090b369"} Feb 17 09:10:29 crc kubenswrapper[4848]: I0217 09:10:29.086374 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" 
event={"ID":"cd7df0ba-ff8c-48ce-ad07-8ac50003f318","Type":"ContainerStarted","Data":"a0f0c63180b01026add435a7dea6beff5e0a5ba2d2ef4424b93dae6e1a7240ba"} Feb 17 09:10:29 crc kubenswrapper[4848]: I0217 09:10:29.086795 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:29 crc kubenswrapper[4848]: I0217 09:10:29.087839 4848 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-xvh8l container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body= Feb 17 09:10:29 crc kubenswrapper[4848]: I0217 09:10:29.087882 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" podUID="cd7df0ba-ff8c-48ce-ad07-8ac50003f318" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" Feb 17 09:10:29 crc kubenswrapper[4848]: I0217 09:10:29.115933 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" podStartSLOduration=23.115915334 podStartE2EDuration="23.115915334s" podCreationTimestamp="2026-02-17 09:10:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:10:29.112710346 +0000 UTC m=+306.655965992" watchObservedRunningTime="2026-02-17 09:10:29.115915334 +0000 UTC m=+306.659170980" Feb 17 09:10:30 crc kubenswrapper[4848]: I0217 09:10:30.096476 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-xvh8l" Feb 17 09:10:32 crc kubenswrapper[4848]: I0217 09:10:32.222170 4848 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 17 09:10:33 crc kubenswrapper[4848]: I0217 09:10:33.631864 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 17 09:10:33 crc kubenswrapper[4848]: I0217 09:10:33.642799 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 17 09:10:40 crc kubenswrapper[4848]: I0217 09:10:40.834694 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Feb 17 09:10:40 crc kubenswrapper[4848]: I0217 09:10:40.847017 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.534309 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-m54wv"] Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535144 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerName="extract-utilities" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535165 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerName="extract-utilities" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535186 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerName="extract-content" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535199 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerName="extract-content" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535220 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerName="registry-server" 
Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535233 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerName="registry-server" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535251 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerName="extract-content" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535264 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerName="extract-content" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535282 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerName="extract-content" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535294 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerName="extract-content" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535312 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerName="extract-content" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535323 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerName="extract-content" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535338 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerName="extract-utilities" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535351 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerName="extract-utilities" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535366 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerName="registry-server" Feb 17 
09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535378 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerName="registry-server" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535394 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afb3ce4-f468-4042-b4d5-61285893e7e1" containerName="marketplace-operator" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535405 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afb3ce4-f468-4042-b4d5-61285893e7e1" containerName="marketplace-operator" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535421 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerName="registry-server" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535434 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerName="registry-server" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535449 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerName="extract-utilities" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535461 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerName="extract-utilities" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535479 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerName="registry-server" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535491 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerName="registry-server" Feb 17 09:10:50 crc kubenswrapper[4848]: E0217 09:10:50.535504 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerName="extract-utilities" Feb 
17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535516 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerName="extract-utilities" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535662 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afb3ce4-f468-4042-b4d5-61285893e7e1" containerName="marketplace-operator" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535686 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f224f79c-f6d1-442a-be23-4fc8e7527d3a" containerName="registry-server" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535712 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="09ac4713-0dcb-4908-8063-b6e029c132d7" containerName="registry-server" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535724 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="38529e93-75d3-4b08-a3dc-939fab0cbf66" containerName="registry-server" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.535742 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3a74de7-62f1-46c2-b518-6baa8b222b1b" containerName="registry-server" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.536950 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.548432 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.553386 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m54wv"] Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.616095 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31b2644d-739b-4457-a3cc-c30d6b116423-utilities\") pod \"redhat-operators-m54wv\" (UID: \"31b2644d-739b-4457-a3cc-c30d6b116423\") " pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.616221 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31b2644d-739b-4457-a3cc-c30d6b116423-catalog-content\") pod \"redhat-operators-m54wv\" (UID: \"31b2644d-739b-4457-a3cc-c30d6b116423\") " pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.616294 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2bpw\" (UniqueName: \"kubernetes.io/projected/31b2644d-739b-4457-a3cc-c30d6b116423-kube-api-access-b2bpw\") pod \"redhat-operators-m54wv\" (UID: \"31b2644d-739b-4457-a3cc-c30d6b116423\") " pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.717843 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31b2644d-739b-4457-a3cc-c30d6b116423-utilities\") pod \"redhat-operators-m54wv\" (UID: \"31b2644d-739b-4457-a3cc-c30d6b116423\") " 
pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.717970 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31b2644d-739b-4457-a3cc-c30d6b116423-catalog-content\") pod \"redhat-operators-m54wv\" (UID: \"31b2644d-739b-4457-a3cc-c30d6b116423\") " pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.718049 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2bpw\" (UniqueName: \"kubernetes.io/projected/31b2644d-739b-4457-a3cc-c30d6b116423-kube-api-access-b2bpw\") pod \"redhat-operators-m54wv\" (UID: \"31b2644d-739b-4457-a3cc-c30d6b116423\") " pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.718751 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31b2644d-739b-4457-a3cc-c30d6b116423-utilities\") pod \"redhat-operators-m54wv\" (UID: \"31b2644d-739b-4457-a3cc-c30d6b116423\") " pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.719200 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31b2644d-739b-4457-a3cc-c30d6b116423-catalog-content\") pod \"redhat-operators-m54wv\" (UID: \"31b2644d-739b-4457-a3cc-c30d6b116423\") " pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.724099 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vnl45"] Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.725629 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.730021 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.737502 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnl45"] Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.755968 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2bpw\" (UniqueName: \"kubernetes.io/projected/31b2644d-739b-4457-a3cc-c30d6b116423-kube-api-access-b2bpw\") pod \"redhat-operators-m54wv\" (UID: \"31b2644d-739b-4457-a3cc-c30d6b116423\") " pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.819278 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01-utilities\") pod \"redhat-marketplace-vnl45\" (UID: \"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01\") " pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.819334 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nr8c\" (UniqueName: \"kubernetes.io/projected/dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01-kube-api-access-7nr8c\") pod \"redhat-marketplace-vnl45\" (UID: \"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01\") " pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.819376 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01-catalog-content\") pod \"redhat-marketplace-vnl45\" (UID: 
\"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01\") " pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.862533 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.921165 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01-catalog-content\") pod \"redhat-marketplace-vnl45\" (UID: \"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01\") " pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.922296 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01-catalog-content\") pod \"redhat-marketplace-vnl45\" (UID: \"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01\") " pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.923107 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01-utilities\") pod \"redhat-marketplace-vnl45\" (UID: \"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01\") " pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.923191 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nr8c\" (UniqueName: \"kubernetes.io/projected/dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01-kube-api-access-7nr8c\") pod \"redhat-marketplace-vnl45\" (UID: \"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01\") " pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.923906 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01-utilities\") pod \"redhat-marketplace-vnl45\" (UID: \"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01\") " pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:50 crc kubenswrapper[4848]: I0217 09:10:50.948434 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nr8c\" (UniqueName: \"kubernetes.io/projected/dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01-kube-api-access-7nr8c\") pod \"redhat-marketplace-vnl45\" (UID: \"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01\") " pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:51 crc kubenswrapper[4848]: I0217 09:10:51.045164 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:10:51 crc kubenswrapper[4848]: I0217 09:10:51.274448 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vnl45"] Feb 17 09:10:51 crc kubenswrapper[4848]: I0217 09:10:51.345257 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-m54wv"] Feb 17 09:10:51 crc kubenswrapper[4848]: W0217 09:10:51.349224 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31b2644d_739b_4457_a3cc_c30d6b116423.slice/crio-17ae38f5631003c502aea839beb64f28178d747eacf1ee78af803d6d162f40ac WatchSource:0}: Error finding container 17ae38f5631003c502aea839beb64f28178d747eacf1ee78af803d6d162f40ac: Status 404 returned error can't find the container with id 17ae38f5631003c502aea839beb64f28178d747eacf1ee78af803d6d162f40ac Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.223412 4848 generic.go:334] "Generic (PLEG): container finished" podID="dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01" containerID="5d1c4515361c45988d62c20f44b30ca926b3250a4ad359a3aecb675a3389d69d" exitCode=0 Feb 17 09:10:52 crc 
kubenswrapper[4848]: I0217 09:10:52.223506 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnl45" event={"ID":"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01","Type":"ContainerDied","Data":"5d1c4515361c45988d62c20f44b30ca926b3250a4ad359a3aecb675a3389d69d"} Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.223620 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnl45" event={"ID":"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01","Type":"ContainerStarted","Data":"52b9f0527344513f0f09b163dfd328014764c7adf190a00717a4fa2bec5f525b"} Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.226698 4848 generic.go:334] "Generic (PLEG): container finished" podID="31b2644d-739b-4457-a3cc-c30d6b116423" containerID="87b9fd66695d40b7c5dc8bc265bedceb8bc98e925a534428e264f7bbd98c7e82" exitCode=0 Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.226779 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m54wv" event={"ID":"31b2644d-739b-4457-a3cc-c30d6b116423","Type":"ContainerDied","Data":"87b9fd66695d40b7c5dc8bc265bedceb8bc98e925a534428e264f7bbd98c7e82"} Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.226818 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m54wv" event={"ID":"31b2644d-739b-4457-a3cc-c30d6b116423","Type":"ContainerStarted","Data":"17ae38f5631003c502aea839beb64f28178d747eacf1ee78af803d6d162f40ac"} Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.524917 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-d4ccz"] Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.526629 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.528590 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.532449 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4ccz"] Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.647277 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36cf8f9-f4fd-456f-80ce-c4d68f71273b-utilities\") pod \"community-operators-d4ccz\" (UID: \"f36cf8f9-f4fd-456f-80ce-c4d68f71273b\") " pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.647356 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36cf8f9-f4fd-456f-80ce-c4d68f71273b-catalog-content\") pod \"community-operators-d4ccz\" (UID: \"f36cf8f9-f4fd-456f-80ce-c4d68f71273b\") " pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.647454 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwk8c\" (UniqueName: \"kubernetes.io/projected/f36cf8f9-f4fd-456f-80ce-c4d68f71273b-kube-api-access-lwk8c\") pod \"community-operators-d4ccz\" (UID: \"f36cf8f9-f4fd-456f-80ce-c4d68f71273b\") " pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.748830 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36cf8f9-f4fd-456f-80ce-c4d68f71273b-utilities\") pod \"community-operators-d4ccz\" (UID: 
\"f36cf8f9-f4fd-456f-80ce-c4d68f71273b\") " pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.748999 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36cf8f9-f4fd-456f-80ce-c4d68f71273b-catalog-content\") pod \"community-operators-d4ccz\" (UID: \"f36cf8f9-f4fd-456f-80ce-c4d68f71273b\") " pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.749113 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwk8c\" (UniqueName: \"kubernetes.io/projected/f36cf8f9-f4fd-456f-80ce-c4d68f71273b-kube-api-access-lwk8c\") pod \"community-operators-d4ccz\" (UID: \"f36cf8f9-f4fd-456f-80ce-c4d68f71273b\") " pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.749544 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f36cf8f9-f4fd-456f-80ce-c4d68f71273b-utilities\") pod \"community-operators-d4ccz\" (UID: \"f36cf8f9-f4fd-456f-80ce-c4d68f71273b\") " pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.749727 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f36cf8f9-f4fd-456f-80ce-c4d68f71273b-catalog-content\") pod \"community-operators-d4ccz\" (UID: \"f36cf8f9-f4fd-456f-80ce-c4d68f71273b\") " pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.784019 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwk8c\" (UniqueName: \"kubernetes.io/projected/f36cf8f9-f4fd-456f-80ce-c4d68f71273b-kube-api-access-lwk8c\") pod \"community-operators-d4ccz\" (UID: 
\"f36cf8f9-f4fd-456f-80ce-c4d68f71273b\") " pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:52 crc kubenswrapper[4848]: I0217 09:10:52.845985 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.235508 4848 generic.go:334] "Generic (PLEG): container finished" podID="dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01" containerID="6c15702195f86ba88ed8ae20182c3ac545c26a5c684958fddf054822be02c3fe" exitCode=0 Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.235613 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnl45" event={"ID":"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01","Type":"ContainerDied","Data":"6c15702195f86ba88ed8ae20182c3ac545c26a5c684958fddf054822be02c3fe"} Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.237982 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m54wv" event={"ID":"31b2644d-739b-4457-a3cc-c30d6b116423","Type":"ContainerStarted","Data":"d69b33a9adfbb7ece1edcb40ccc94b39c86e81f41fa66b81673a2fe671591118"} Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.321750 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-d4ccz"] Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.535936 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-thxdc"] Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.537142 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.543108 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-thxdc"] Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.584053 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.683853 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e5e859-f703-486e-9d9b-06660c9d3e51-utilities\") pod \"certified-operators-thxdc\" (UID: \"42e5e859-f703-486e-9d9b-06660c9d3e51\") " pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.683898 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e5e859-f703-486e-9d9b-06660c9d3e51-catalog-content\") pod \"certified-operators-thxdc\" (UID: \"42e5e859-f703-486e-9d9b-06660c9d3e51\") " pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.683953 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clv9k\" (UniqueName: \"kubernetes.io/projected/42e5e859-f703-486e-9d9b-06660c9d3e51-kube-api-access-clv9k\") pod \"certified-operators-thxdc\" (UID: \"42e5e859-f703-486e-9d9b-06660c9d3e51\") " pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.785237 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clv9k\" (UniqueName: \"kubernetes.io/projected/42e5e859-f703-486e-9d9b-06660c9d3e51-kube-api-access-clv9k\") pod \"certified-operators-thxdc\" 
(UID: \"42e5e859-f703-486e-9d9b-06660c9d3e51\") " pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.785321 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e5e859-f703-486e-9d9b-06660c9d3e51-utilities\") pod \"certified-operators-thxdc\" (UID: \"42e5e859-f703-486e-9d9b-06660c9d3e51\") " pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.785354 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e5e859-f703-486e-9d9b-06660c9d3e51-catalog-content\") pod \"certified-operators-thxdc\" (UID: \"42e5e859-f703-486e-9d9b-06660c9d3e51\") " pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.785921 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/42e5e859-f703-486e-9d9b-06660c9d3e51-catalog-content\") pod \"certified-operators-thxdc\" (UID: \"42e5e859-f703-486e-9d9b-06660c9d3e51\") " pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.786470 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/42e5e859-f703-486e-9d9b-06660c9d3e51-utilities\") pod \"certified-operators-thxdc\" (UID: \"42e5e859-f703-486e-9d9b-06660c9d3e51\") " pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.817288 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clv9k\" (UniqueName: \"kubernetes.io/projected/42e5e859-f703-486e-9d9b-06660c9d3e51-kube-api-access-clv9k\") pod \"certified-operators-thxdc\" (UID: \"42e5e859-f703-486e-9d9b-06660c9d3e51\") " 
pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:53 crc kubenswrapper[4848]: I0217 09:10:53.970637 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:10:54 crc kubenswrapper[4848]: I0217 09:10:54.245356 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vnl45" event={"ID":"dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01","Type":"ContainerStarted","Data":"9838b9c8677e5f85756095b83adc9107ea0b2ec63716ccc0f5569f7448c7efdf"} Feb 17 09:10:54 crc kubenswrapper[4848]: I0217 09:10:54.247118 4848 generic.go:334] "Generic (PLEG): container finished" podID="f36cf8f9-f4fd-456f-80ce-c4d68f71273b" containerID="53d89391a2580cea699cd860f98db30a09d0fcf364c650796fb8a49e13231695" exitCode=0 Feb 17 09:10:54 crc kubenswrapper[4848]: I0217 09:10:54.247207 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ccz" event={"ID":"f36cf8f9-f4fd-456f-80ce-c4d68f71273b","Type":"ContainerDied","Data":"53d89391a2580cea699cd860f98db30a09d0fcf364c650796fb8a49e13231695"} Feb 17 09:10:54 crc kubenswrapper[4848]: I0217 09:10:54.247258 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ccz" event={"ID":"f36cf8f9-f4fd-456f-80ce-c4d68f71273b","Type":"ContainerStarted","Data":"faf4d479449462390abc9fa1f9754033ad8be755bd024c0918a9df74068faabf"} Feb 17 09:10:54 crc kubenswrapper[4848]: I0217 09:10:54.249134 4848 generic.go:334] "Generic (PLEG): container finished" podID="31b2644d-739b-4457-a3cc-c30d6b116423" containerID="d69b33a9adfbb7ece1edcb40ccc94b39c86e81f41fa66b81673a2fe671591118" exitCode=0 Feb 17 09:10:54 crc kubenswrapper[4848]: I0217 09:10:54.249181 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m54wv" 
event={"ID":"31b2644d-739b-4457-a3cc-c30d6b116423","Type":"ContainerDied","Data":"d69b33a9adfbb7ece1edcb40ccc94b39c86e81f41fa66b81673a2fe671591118"} Feb 17 09:10:54 crc kubenswrapper[4848]: I0217 09:10:54.273578 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vnl45" podStartSLOduration=2.774438479 podStartE2EDuration="4.27355888s" podCreationTimestamp="2026-02-17 09:10:50 +0000 UTC" firstStartedPulling="2026-02-17 09:10:52.225601557 +0000 UTC m=+329.768857223" lastFinishedPulling="2026-02-17 09:10:53.724721978 +0000 UTC m=+331.267977624" observedRunningTime="2026-02-17 09:10:54.267133331 +0000 UTC m=+331.810388977" watchObservedRunningTime="2026-02-17 09:10:54.27355888 +0000 UTC m=+331.816814526" Feb 17 09:10:54 crc kubenswrapper[4848]: I0217 09:10:54.421919 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-thxdc"] Feb 17 09:10:54 crc kubenswrapper[4848]: W0217 09:10:54.434327 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42e5e859_f703_486e_9d9b_06660c9d3e51.slice/crio-9908086d1a40dfee52c0ba2edfb76e857fd4e191cf8779287ba08f0dc019b90b WatchSource:0}: Error finding container 9908086d1a40dfee52c0ba2edfb76e857fd4e191cf8779287ba08f0dc019b90b: Status 404 returned error can't find the container with id 9908086d1a40dfee52c0ba2edfb76e857fd4e191cf8779287ba08f0dc019b90b Feb 17 09:10:55 crc kubenswrapper[4848]: I0217 09:10:55.255283 4848 generic.go:334] "Generic (PLEG): container finished" podID="42e5e859-f703-486e-9d9b-06660c9d3e51" containerID="3b213189301836b4dc904d3ae08ecf332c5767060293bac1c3062c32f9169bd6" exitCode=0 Feb 17 09:10:55 crc kubenswrapper[4848]: I0217 09:10:55.255365 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thxdc" 
event={"ID":"42e5e859-f703-486e-9d9b-06660c9d3e51","Type":"ContainerDied","Data":"3b213189301836b4dc904d3ae08ecf332c5767060293bac1c3062c32f9169bd6"} Feb 17 09:10:55 crc kubenswrapper[4848]: I0217 09:10:55.255643 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thxdc" event={"ID":"42e5e859-f703-486e-9d9b-06660c9d3e51","Type":"ContainerStarted","Data":"9908086d1a40dfee52c0ba2edfb76e857fd4e191cf8779287ba08f0dc019b90b"} Feb 17 09:10:55 crc kubenswrapper[4848]: I0217 09:10:55.258077 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ccz" event={"ID":"f36cf8f9-f4fd-456f-80ce-c4d68f71273b","Type":"ContainerStarted","Data":"0e658471c0d78698273ff5f5e031edb9c82bbc6a3745e9f8a3cadc7b3c5fe915"} Feb 17 09:10:55 crc kubenswrapper[4848]: I0217 09:10:55.261374 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-m54wv" event={"ID":"31b2644d-739b-4457-a3cc-c30d6b116423","Type":"ContainerStarted","Data":"4295d7d3be2e0e3b67e23451e62ef9778da1a827c29c70fce2a47b1e0f3702e5"} Feb 17 09:10:55 crc kubenswrapper[4848]: I0217 09:10:55.318072 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-m54wv" podStartSLOduration=2.888970547 podStartE2EDuration="5.318054209s" podCreationTimestamp="2026-02-17 09:10:50 +0000 UTC" firstStartedPulling="2026-02-17 09:10:52.228159136 +0000 UTC m=+329.771414792" lastFinishedPulling="2026-02-17 09:10:54.657242798 +0000 UTC m=+332.200498454" observedRunningTime="2026-02-17 09:10:55.315942984 +0000 UTC m=+332.859198640" watchObservedRunningTime="2026-02-17 09:10:55.318054209 +0000 UTC m=+332.861309855" Feb 17 09:10:56 crc kubenswrapper[4848]: I0217 09:10:56.271648 4848 generic.go:334] "Generic (PLEG): container finished" podID="f36cf8f9-f4fd-456f-80ce-c4d68f71273b" containerID="0e658471c0d78698273ff5f5e031edb9c82bbc6a3745e9f8a3cadc7b3c5fe915" 
exitCode=0 Feb 17 09:10:56 crc kubenswrapper[4848]: I0217 09:10:56.271954 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ccz" event={"ID":"f36cf8f9-f4fd-456f-80ce-c4d68f71273b","Type":"ContainerDied","Data":"0e658471c0d78698273ff5f5e031edb9c82bbc6a3745e9f8a3cadc7b3c5fe915"} Feb 17 09:10:56 crc kubenswrapper[4848]: I0217 09:10:56.281386 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thxdc" event={"ID":"42e5e859-f703-486e-9d9b-06660c9d3e51","Type":"ContainerStarted","Data":"4a485a892bfe3b36ce736dff6d07455fac96be73721fa33acc69c39a29147d17"} Feb 17 09:10:57 crc kubenswrapper[4848]: I0217 09:10:57.286388 4848 generic.go:334] "Generic (PLEG): container finished" podID="42e5e859-f703-486e-9d9b-06660c9d3e51" containerID="4a485a892bfe3b36ce736dff6d07455fac96be73721fa33acc69c39a29147d17" exitCode=0 Feb 17 09:10:57 crc kubenswrapper[4848]: I0217 09:10:57.286423 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thxdc" event={"ID":"42e5e859-f703-486e-9d9b-06660c9d3e51","Type":"ContainerDied","Data":"4a485a892bfe3b36ce736dff6d07455fac96be73721fa33acc69c39a29147d17"} Feb 17 09:10:57 crc kubenswrapper[4848]: I0217 09:10:57.286740 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-thxdc" event={"ID":"42e5e859-f703-486e-9d9b-06660c9d3e51","Type":"ContainerStarted","Data":"b058631478a3c02c921adef80d49cf18ebb97fd401064e716ea6ba2abeceaa0e"} Feb 17 09:10:57 crc kubenswrapper[4848]: I0217 09:10:57.289747 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-d4ccz" event={"ID":"f36cf8f9-f4fd-456f-80ce-c4d68f71273b","Type":"ContainerStarted","Data":"6a6ef5b4922dfb66dd0274eae4940b00eb5b6d961b7c91311468982094565e2a"} Feb 17 09:10:57 crc kubenswrapper[4848]: I0217 09:10:57.342618 4848 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/certified-operators-thxdc" podStartSLOduration=2.8761027070000003 podStartE2EDuration="4.342602409s" podCreationTimestamp="2026-02-17 09:10:53 +0000 UTC" firstStartedPulling="2026-02-17 09:10:55.256976872 +0000 UTC m=+332.800232518" lastFinishedPulling="2026-02-17 09:10:56.723476554 +0000 UTC m=+334.266732220" observedRunningTime="2026-02-17 09:10:57.322495767 +0000 UTC m=+334.865751413" watchObservedRunningTime="2026-02-17 09:10:57.342602409 +0000 UTC m=+334.885858055" Feb 17 09:11:00 crc kubenswrapper[4848]: I0217 09:11:00.863222 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:11:00 crc kubenswrapper[4848]: I0217 09:11:00.863615 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:11:01 crc kubenswrapper[4848]: I0217 09:11:01.046362 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:11:01 crc kubenswrapper[4848]: I0217 09:11:01.046549 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:11:01 crc kubenswrapper[4848]: I0217 09:11:01.127906 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:11:01 crc kubenswrapper[4848]: I0217 09:11:01.162175 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-d4ccz" podStartSLOduration=6.748725456 podStartE2EDuration="9.162139312s" podCreationTimestamp="2026-02-17 09:10:52 +0000 UTC" firstStartedPulling="2026-02-17 09:10:54.248488015 +0000 UTC m=+331.791743661" lastFinishedPulling="2026-02-17 09:10:56.661901851 +0000 UTC m=+334.205157517" observedRunningTime="2026-02-17 09:10:57.343069443 +0000 UTC 
m=+334.886325089" watchObservedRunningTime="2026-02-17 09:11:01.162139312 +0000 UTC m=+338.705394988" Feb 17 09:11:01 crc kubenswrapper[4848]: I0217 09:11:01.379710 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vnl45" Feb 17 09:11:01 crc kubenswrapper[4848]: I0217 09:11:01.926162 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-m54wv" podUID="31b2644d-739b-4457-a3cc-c30d6b116423" containerName="registry-server" probeResult="failure" output=< Feb 17 09:11:01 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s Feb 17 09:11:01 crc kubenswrapper[4848]: > Feb 17 09:11:02 crc kubenswrapper[4848]: I0217 09:11:02.846875 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:11:02 crc kubenswrapper[4848]: I0217 09:11:02.846953 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:11:02 crc kubenswrapper[4848]: I0217 09:11:02.918327 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:11:03 crc kubenswrapper[4848]: I0217 09:11:03.379882 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-d4ccz" Feb 17 09:11:03 crc kubenswrapper[4848]: I0217 09:11:03.971731 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:11:03 crc kubenswrapper[4848]: I0217 09:11:03.972210 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:11:04 crc kubenswrapper[4848]: I0217 09:11:04.035445 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:11:04 crc kubenswrapper[4848]: I0217 09:11:04.394851 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-thxdc" Feb 17 09:11:10 crc kubenswrapper[4848]: I0217 09:11:10.925865 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:11:10 crc kubenswrapper[4848]: I0217 09:11:10.986105 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-m54wv" Feb 17 09:11:18 crc kubenswrapper[4848]: I0217 09:11:18.771118 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:11:18 crc kubenswrapper[4848]: I0217 09:11:18.771416 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.032482 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nrrj6"] Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.033599 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.048400 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nrrj6"] Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.196321 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4403a163-197c-439b-8af6-8a9ba390a43d-registry-certificates\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.196393 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4403a163-197c-439b-8af6-8a9ba390a43d-bound-sa-token\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.196414 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4403a163-197c-439b-8af6-8a9ba390a43d-registry-tls\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.196450 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.196478 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4hwv\" (UniqueName: \"kubernetes.io/projected/4403a163-197c-439b-8af6-8a9ba390a43d-kube-api-access-v4hwv\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.196498 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4403a163-197c-439b-8af6-8a9ba390a43d-trusted-ca\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.196518 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4403a163-197c-439b-8af6-8a9ba390a43d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.196539 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4403a163-197c-439b-8af6-8a9ba390a43d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.225369 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.297459 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4403a163-197c-439b-8af6-8a9ba390a43d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.297828 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4403a163-197c-439b-8af6-8a9ba390a43d-registry-certificates\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.297861 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4403a163-197c-439b-8af6-8a9ba390a43d-bound-sa-token\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.297881 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4403a163-197c-439b-8af6-8a9ba390a43d-registry-tls\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.297918 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hwv\" (UniqueName: \"kubernetes.io/projected/4403a163-197c-439b-8af6-8a9ba390a43d-kube-api-access-v4hwv\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.297935 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4403a163-197c-439b-8af6-8a9ba390a43d-trusted-ca\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.297955 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4403a163-197c-439b-8af6-8a9ba390a43d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.298070 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4403a163-197c-439b-8af6-8a9ba390a43d-ca-trust-extracted\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.299775 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4403a163-197c-439b-8af6-8a9ba390a43d-trusted-ca\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 
09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.300061 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4403a163-197c-439b-8af6-8a9ba390a43d-registry-certificates\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.303309 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4403a163-197c-439b-8af6-8a9ba390a43d-installation-pull-secrets\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.312318 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4403a163-197c-439b-8af6-8a9ba390a43d-registry-tls\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.320612 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4403a163-197c-439b-8af6-8a9ba390a43d-bound-sa-token\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.326144 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hwv\" (UniqueName: \"kubernetes.io/projected/4403a163-197c-439b-8af6-8a9ba390a43d-kube-api-access-v4hwv\") pod \"image-registry-66df7c8f76-nrrj6\" (UID: \"4403a163-197c-439b-8af6-8a9ba390a43d\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.353914 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.551637 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-nrrj6"] Feb 17 09:11:21 crc kubenswrapper[4848]: I0217 09:11:21.672183 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" event={"ID":"4403a163-197c-439b-8af6-8a9ba390a43d","Type":"ContainerStarted","Data":"ce1add0ef6367799d5092fa622e8f082d7049ba5cf40aa609444372d0d60a354"} Feb 17 09:11:22 crc kubenswrapper[4848]: I0217 09:11:22.679659 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" event={"ID":"4403a163-197c-439b-8af6-8a9ba390a43d","Type":"ContainerStarted","Data":"606ce250c90aa8a52b15e4f0dfccf9c8b740787f8dc0b22d8660f32dd2d9d002"} Feb 17 09:11:22 crc kubenswrapper[4848]: I0217 09:11:22.679872 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:22 crc kubenswrapper[4848]: I0217 09:11:22.707081 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" podStartSLOduration=1.707061123 podStartE2EDuration="1.707061123s" podCreationTimestamp="2026-02-17 09:11:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:11:22.703046919 +0000 UTC m=+360.246302605" watchObservedRunningTime="2026-02-17 09:11:22.707061123 +0000 UTC m=+360.250316809" Feb 17 09:11:41 crc kubenswrapper[4848]: I0217 09:11:41.365403 4848 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-nrrj6" Feb 17 09:11:41 crc kubenswrapper[4848]: I0217 09:11:41.475120 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hfc2"] Feb 17 09:11:48 crc kubenswrapper[4848]: I0217 09:11:48.771739 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:11:48 crc kubenswrapper[4848]: I0217 09:11:48.772889 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.531198 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" podUID="e29aabb7-fe3b-4887-a00d-535144b46d4b" containerName="registry" containerID="cri-o://fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525" gracePeriod=30 Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.938203 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.987946 4848 generic.go:334] "Generic (PLEG): container finished" podID="e29aabb7-fe3b-4887-a00d-535144b46d4b" containerID="fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525" exitCode=0 Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.987997 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" event={"ID":"e29aabb7-fe3b-4887-a00d-535144b46d4b","Type":"ContainerDied","Data":"fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525"} Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.988039 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" event={"ID":"e29aabb7-fe3b-4887-a00d-535144b46d4b","Type":"ContainerDied","Data":"10e8c25c7a42a2caab60e4ea48c938ead49a84cd212a67475f093407457d936b"} Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.988060 4848 scope.go:117] "RemoveContainer" containerID="fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525" Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.988052 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-4hfc2" Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.998298 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nhsj\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-kube-api-access-2nhsj\") pod \"e29aabb7-fe3b-4887-a00d-535144b46d4b\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.998344 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-tls\") pod \"e29aabb7-fe3b-4887-a00d-535144b46d4b\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.998369 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-trusted-ca\") pod \"e29aabb7-fe3b-4887-a00d-535144b46d4b\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.998413 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-bound-sa-token\") pod \"e29aabb7-fe3b-4887-a00d-535144b46d4b\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.998581 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e29aabb7-fe3b-4887-a00d-535144b46d4b\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.998629 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e29aabb7-fe3b-4887-a00d-535144b46d4b-installation-pull-secrets\") pod \"e29aabb7-fe3b-4887-a00d-535144b46d4b\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.998652 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e29aabb7-fe3b-4887-a00d-535144b46d4b-ca-trust-extracted\") pod \"e29aabb7-fe3b-4887-a00d-535144b46d4b\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.998685 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-certificates\") pod \"e29aabb7-fe3b-4887-a00d-535144b46d4b\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " Feb 17 09:12:06 crc kubenswrapper[4848]: I0217 09:12:06.999444 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "e29aabb7-fe3b-4887-a00d-535144b46d4b" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.001273 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "e29aabb7-fe3b-4887-a00d-535144b46d4b" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.006866 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e29aabb7-fe3b-4887-a00d-535144b46d4b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "e29aabb7-fe3b-4887-a00d-535144b46d4b" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.010374 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "e29aabb7-fe3b-4887-a00d-535144b46d4b" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:12:07 crc kubenswrapper[4848]: E0217 09:12:07.010684 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:e29aabb7-fe3b-4887-a00d-535144b46d4b nodeName:}" failed. No retries permitted until 2026-02-17 09:12:07.510617497 +0000 UTC m=+405.053873163 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "registry-storage" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "e29aabb7-fe3b-4887-a00d-535144b46d4b" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.010931 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-kube-api-access-2nhsj" (OuterVolumeSpecName: "kube-api-access-2nhsj") pod "e29aabb7-fe3b-4887-a00d-535144b46d4b" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b"). InnerVolumeSpecName "kube-api-access-2nhsj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.011320 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "e29aabb7-fe3b-4887-a00d-535144b46d4b" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.014391 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e29aabb7-fe3b-4887-a00d-535144b46d4b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "e29aabb7-fe3b-4887-a00d-535144b46d4b" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.015293 4848 scope.go:117] "RemoveContainer" containerID="fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525" Feb 17 09:12:07 crc kubenswrapper[4848]: E0217 09:12:07.016119 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525\": container with ID starting with fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525 not found: ID does not exist" containerID="fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.016169 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525"} err="failed to get container status \"fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525\": rpc error: code = NotFound desc = could not find container \"fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525\": container with ID starting with fcaddb0cdd07881363d2945ced691542cb8597560fe22f405e6114d33e3a4525 not found: ID does not exist" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.101254 4848 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.101316 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nhsj\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-kube-api-access-2nhsj\") on node \"crc\" DevicePath \"\"" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.101343 4848 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-registry-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.101370 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e29aabb7-fe3b-4887-a00d-535144b46d4b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.101393 4848 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e29aabb7-fe3b-4887-a00d-535144b46d4b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.101416 4848 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e29aabb7-fe3b-4887-a00d-535144b46d4b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.101436 4848 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e29aabb7-fe3b-4887-a00d-535144b46d4b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.608067 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"e29aabb7-fe3b-4887-a00d-535144b46d4b\" (UID: \"e29aabb7-fe3b-4887-a00d-535144b46d4b\") " Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.620943 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "e29aabb7-fe3b-4887-a00d-535144b46d4b" (UID: "e29aabb7-fe3b-4887-a00d-535144b46d4b"). 
InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.736216 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hfc2"] Feb 17 09:12:07 crc kubenswrapper[4848]: I0217 09:12:07.743092 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-4hfc2"] Feb 17 09:12:09 crc kubenswrapper[4848]: I0217 09:12:09.391378 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e29aabb7-fe3b-4887-a00d-535144b46d4b" path="/var/lib/kubelet/pods/e29aabb7-fe3b-4887-a00d-535144b46d4b/volumes" Feb 17 09:12:18 crc kubenswrapper[4848]: I0217 09:12:18.771813 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:12:18 crc kubenswrapper[4848]: I0217 09:12:18.772163 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:12:18 crc kubenswrapper[4848]: I0217 09:12:18.772220 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:12:18 crc kubenswrapper[4848]: I0217 09:12:18.772773 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c10c26abd2aa6ffa5969a35af04b1b8e427c06a994c5d6c12ac096d788cf26a4"} 
pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:12:18 crc kubenswrapper[4848]: I0217 09:12:18.772849 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://c10c26abd2aa6ffa5969a35af04b1b8e427c06a994c5d6c12ac096d788cf26a4" gracePeriod=600 Feb 17 09:12:19 crc kubenswrapper[4848]: I0217 09:12:19.077375 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="c10c26abd2aa6ffa5969a35af04b1b8e427c06a994c5d6c12ac096d788cf26a4" exitCode=0 Feb 17 09:12:19 crc kubenswrapper[4848]: I0217 09:12:19.077487 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"c10c26abd2aa6ffa5969a35af04b1b8e427c06a994c5d6c12ac096d788cf26a4"} Feb 17 09:12:19 crc kubenswrapper[4848]: I0217 09:12:19.077832 4848 scope.go:117] "RemoveContainer" containerID="d51f571207f05faaff16e994831a9ad31574ee24575d4391111badbfb232a3c8" Feb 17 09:12:20 crc kubenswrapper[4848]: I0217 09:12:20.088925 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"22658753e35aaed607fe5a9bec19a88c03d7cd46abf15a9cf79a3c5b734a8c5c"} Feb 17 09:14:48 crc kubenswrapper[4848]: I0217 09:14:48.772099 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Feb 17 09:14:48 crc kubenswrapper[4848]: I0217 09:14:48.772945 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.164728 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88"] Feb 17 09:15:00 crc kubenswrapper[4848]: E0217 09:15:00.165515 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e29aabb7-fe3b-4887-a00d-535144b46d4b" containerName="registry" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.165531 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e29aabb7-fe3b-4887-a00d-535144b46d4b" containerName="registry" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.165654 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e29aabb7-fe3b-4887-a00d-535144b46d4b" containerName="registry" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.166183 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.169074 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.169272 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.177711 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88"] Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.265426 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fw96\" (UniqueName: \"kubernetes.io/projected/98bce666-a802-42f1-9b9e-366d88c049ba-kube-api-access-8fw96\") pod \"collect-profiles-29521995-jzx88\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.265483 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98bce666-a802-42f1-9b9e-366d88c049ba-secret-volume\") pod \"collect-profiles-29521995-jzx88\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.265506 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98bce666-a802-42f1-9b9e-366d88c049ba-config-volume\") pod \"collect-profiles-29521995-jzx88\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.366665 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98bce666-a802-42f1-9b9e-366d88c049ba-secret-volume\") pod \"collect-profiles-29521995-jzx88\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.366740 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98bce666-a802-42f1-9b9e-366d88c049ba-config-volume\") pod \"collect-profiles-29521995-jzx88\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.366878 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fw96\" (UniqueName: \"kubernetes.io/projected/98bce666-a802-42f1-9b9e-366d88c049ba-kube-api-access-8fw96\") pod \"collect-profiles-29521995-jzx88\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.368051 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98bce666-a802-42f1-9b9e-366d88c049ba-config-volume\") pod \"collect-profiles-29521995-jzx88\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.378535 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/98bce666-a802-42f1-9b9e-366d88c049ba-secret-volume\") pod \"collect-profiles-29521995-jzx88\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.387148 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fw96\" (UniqueName: \"kubernetes.io/projected/98bce666-a802-42f1-9b9e-366d88c049ba-kube-api-access-8fw96\") pod \"collect-profiles-29521995-jzx88\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.491415 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:00 crc kubenswrapper[4848]: I0217 09:15:00.715895 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88"] Feb 17 09:15:01 crc kubenswrapper[4848]: I0217 09:15:01.557780 4848 generic.go:334] "Generic (PLEG): container finished" podID="98bce666-a802-42f1-9b9e-366d88c049ba" containerID="a5e852f666ccc30caf2695740c63d0c07c0a4db89fe690eca1fe310e104f6011" exitCode=0 Feb 17 09:15:01 crc kubenswrapper[4848]: I0217 09:15:01.557893 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" event={"ID":"98bce666-a802-42f1-9b9e-366d88c049ba","Type":"ContainerDied","Data":"a5e852f666ccc30caf2695740c63d0c07c0a4db89fe690eca1fe310e104f6011"} Feb 17 09:15:01 crc kubenswrapper[4848]: I0217 09:15:01.558049 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" 
event={"ID":"98bce666-a802-42f1-9b9e-366d88c049ba","Type":"ContainerStarted","Data":"b92399ad85113363835e91e3493f2789f7846b95d110a1629850445e36cd5e72"} Feb 17 09:15:02 crc kubenswrapper[4848]: I0217 09:15:02.849179 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:02 crc kubenswrapper[4848]: I0217 09:15:02.902626 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fw96\" (UniqueName: \"kubernetes.io/projected/98bce666-a802-42f1-9b9e-366d88c049ba-kube-api-access-8fw96\") pod \"98bce666-a802-42f1-9b9e-366d88c049ba\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " Feb 17 09:15:02 crc kubenswrapper[4848]: I0217 09:15:02.902731 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98bce666-a802-42f1-9b9e-366d88c049ba-config-volume\") pod \"98bce666-a802-42f1-9b9e-366d88c049ba\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " Feb 17 09:15:02 crc kubenswrapper[4848]: I0217 09:15:02.902780 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98bce666-a802-42f1-9b9e-366d88c049ba-secret-volume\") pod \"98bce666-a802-42f1-9b9e-366d88c049ba\" (UID: \"98bce666-a802-42f1-9b9e-366d88c049ba\") " Feb 17 09:15:02 crc kubenswrapper[4848]: I0217 09:15:02.904320 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98bce666-a802-42f1-9b9e-366d88c049ba-config-volume" (OuterVolumeSpecName: "config-volume") pod "98bce666-a802-42f1-9b9e-366d88c049ba" (UID: "98bce666-a802-42f1-9b9e-366d88c049ba"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:15:02 crc kubenswrapper[4848]: I0217 09:15:02.911666 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98bce666-a802-42f1-9b9e-366d88c049ba-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "98bce666-a802-42f1-9b9e-366d88c049ba" (UID: "98bce666-a802-42f1-9b9e-366d88c049ba"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:15:02 crc kubenswrapper[4848]: I0217 09:15:02.913070 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98bce666-a802-42f1-9b9e-366d88c049ba-kube-api-access-8fw96" (OuterVolumeSpecName: "kube-api-access-8fw96") pod "98bce666-a802-42f1-9b9e-366d88c049ba" (UID: "98bce666-a802-42f1-9b9e-366d88c049ba"). InnerVolumeSpecName "kube-api-access-8fw96". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.004528 4848 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/98bce666-a802-42f1-9b9e-366d88c049ba-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.004575 4848 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/98bce666-a802-42f1-9b9e-366d88c049ba-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.004594 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fw96\" (UniqueName: \"kubernetes.io/projected/98bce666-a802-42f1-9b9e-366d88c049ba-kube-api-access-8fw96\") on node \"crc\" DevicePath \"\"" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.572035 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" 
event={"ID":"98bce666-a802-42f1-9b9e-366d88c049ba","Type":"ContainerDied","Data":"b92399ad85113363835e91e3493f2789f7846b95d110a1629850445e36cd5e72"} Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.572067 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b92399ad85113363835e91e3493f2789f7846b95d110a1629850445e36cd5e72" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.572088 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.971147 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-whbhm"] Feb 17 09:15:03 crc kubenswrapper[4848]: E0217 09:15:03.971507 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98bce666-a802-42f1-9b9e-366d88c049ba" containerName="collect-profiles" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.971529 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="98bce666-a802-42f1-9b9e-366d88c049ba" containerName="collect-profiles" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.971693 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="98bce666-a802-42f1-9b9e-366d88c049ba" containerName="collect-profiles" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.972286 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-whbhm" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.974518 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk"] Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.975303 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.976484 4848 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-92fqx" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.976699 4848 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-ws2gp" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.976891 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.977837 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.990193 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-crrm4"] Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.991004 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-crrm4" Feb 17 09:15:03 crc kubenswrapper[4848]: I0217 09:15:03.993085 4848 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-tddzq" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.002928 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk"] Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.021073 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9ln5\" (UniqueName: \"kubernetes.io/projected/08f85cef-d7cc-46c2-a1ff-ba22a9b098ab-kube-api-access-v9ln5\") pod \"cert-manager-858654f9db-whbhm\" (UID: \"08f85cef-d7cc-46c2-a1ff-ba22a9b098ab\") " pod="cert-manager/cert-manager-858654f9db-whbhm" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.021278 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwntb\" (UniqueName: \"kubernetes.io/projected/4851dc21-51d9-4c87-a15c-4b7295155016-kube-api-access-zwntb\") pod \"cert-manager-cainjector-cf98fcc89-8pgtk\" (UID: \"4851dc21-51d9-4c87-a15c-4b7295155016\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.021376 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8rcp\" (UniqueName: \"kubernetes.io/projected/f0048a92-b3fb-4c29-b58f-7013b68e1512-kube-api-access-s8rcp\") pod \"cert-manager-webhook-687f57d79b-crrm4\" (UID: \"f0048a92-b3fb-4c29-b58f-7013b68e1512\") " pod="cert-manager/cert-manager-webhook-687f57d79b-crrm4" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.024422 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-crrm4"] Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.053609 
4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-whbhm"] Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.122878 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwntb\" (UniqueName: \"kubernetes.io/projected/4851dc21-51d9-4c87-a15c-4b7295155016-kube-api-access-zwntb\") pod \"cert-manager-cainjector-cf98fcc89-8pgtk\" (UID: \"4851dc21-51d9-4c87-a15c-4b7295155016\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.122945 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8rcp\" (UniqueName: \"kubernetes.io/projected/f0048a92-b3fb-4c29-b58f-7013b68e1512-kube-api-access-s8rcp\") pod \"cert-manager-webhook-687f57d79b-crrm4\" (UID: \"f0048a92-b3fb-4c29-b58f-7013b68e1512\") " pod="cert-manager/cert-manager-webhook-687f57d79b-crrm4" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.123007 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9ln5\" (UniqueName: \"kubernetes.io/projected/08f85cef-d7cc-46c2-a1ff-ba22a9b098ab-kube-api-access-v9ln5\") pod \"cert-manager-858654f9db-whbhm\" (UID: \"08f85cef-d7cc-46c2-a1ff-ba22a9b098ab\") " pod="cert-manager/cert-manager-858654f9db-whbhm" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.141907 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwntb\" (UniqueName: \"kubernetes.io/projected/4851dc21-51d9-4c87-a15c-4b7295155016-kube-api-access-zwntb\") pod \"cert-manager-cainjector-cf98fcc89-8pgtk\" (UID: \"4851dc21-51d9-4c87-a15c-4b7295155016\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.144823 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8rcp\" (UniqueName: 
\"kubernetes.io/projected/f0048a92-b3fb-4c29-b58f-7013b68e1512-kube-api-access-s8rcp\") pod \"cert-manager-webhook-687f57d79b-crrm4\" (UID: \"f0048a92-b3fb-4c29-b58f-7013b68e1512\") " pod="cert-manager/cert-manager-webhook-687f57d79b-crrm4" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.146185 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9ln5\" (UniqueName: \"kubernetes.io/projected/08f85cef-d7cc-46c2-a1ff-ba22a9b098ab-kube-api-access-v9ln5\") pod \"cert-manager-858654f9db-whbhm\" (UID: \"08f85cef-d7cc-46c2-a1ff-ba22a9b098ab\") " pod="cert-manager/cert-manager-858654f9db-whbhm" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.300091 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-whbhm" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.312822 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.342359 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-crrm4" Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.804155 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-whbhm"] Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.815674 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk"] Feb 17 09:15:04 crc kubenswrapper[4848]: W0217 09:15:04.816920 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08f85cef_d7cc_46c2_a1ff_ba22a9b098ab.slice/crio-f1b28e08303a10e9ab4341cf1a4aef7431e17599652e938d4bbd7b33fd9c50e5 WatchSource:0}: Error finding container f1b28e08303a10e9ab4341cf1a4aef7431e17599652e938d4bbd7b33fd9c50e5: Status 404 returned error can't find the container with id f1b28e08303a10e9ab4341cf1a4aef7431e17599652e938d4bbd7b33fd9c50e5 Feb 17 09:15:04 crc kubenswrapper[4848]: W0217 09:15:04.821966 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4851dc21_51d9_4c87_a15c_4b7295155016.slice/crio-39d2d114c4af0fa5929afb9736a3dced4b264200e91ebdd270132a44afdb950c WatchSource:0}: Error finding container 39d2d114c4af0fa5929afb9736a3dced4b264200e91ebdd270132a44afdb950c: Status 404 returned error can't find the container with id 39d2d114c4af0fa5929afb9736a3dced4b264200e91ebdd270132a44afdb950c Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.823728 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 09:15:04 crc kubenswrapper[4848]: I0217 09:15:04.833108 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-crrm4"] Feb 17 09:15:04 crc kubenswrapper[4848]: W0217 09:15:04.841347 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0048a92_b3fb_4c29_b58f_7013b68e1512.slice/crio-389589737b522bf54916fd8e964ff00e3e65071c99588f401cef438fae4dbb3d WatchSource:0}: Error finding container 389589737b522bf54916fd8e964ff00e3e65071c99588f401cef438fae4dbb3d: Status 404 returned error can't find the container with id 389589737b522bf54916fd8e964ff00e3e65071c99588f401cef438fae4dbb3d Feb 17 09:15:05 crc kubenswrapper[4848]: I0217 09:15:05.584641 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-whbhm" event={"ID":"08f85cef-d7cc-46c2-a1ff-ba22a9b098ab","Type":"ContainerStarted","Data":"f1b28e08303a10e9ab4341cf1a4aef7431e17599652e938d4bbd7b33fd9c50e5"} Feb 17 09:15:05 crc kubenswrapper[4848]: I0217 09:15:05.585777 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-crrm4" event={"ID":"f0048a92-b3fb-4c29-b58f-7013b68e1512","Type":"ContainerStarted","Data":"389589737b522bf54916fd8e964ff00e3e65071c99588f401cef438fae4dbb3d"} Feb 17 09:15:05 crc kubenswrapper[4848]: I0217 09:15:05.586726 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk" event={"ID":"4851dc21-51d9-4c87-a15c-4b7295155016","Type":"ContainerStarted","Data":"39d2d114c4af0fa5929afb9736a3dced4b264200e91ebdd270132a44afdb950c"} Feb 17 09:15:09 crc kubenswrapper[4848]: I0217 09:15:09.608108 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-crrm4" event={"ID":"f0048a92-b3fb-4c29-b58f-7013b68e1512","Type":"ContainerStarted","Data":"995044fe8b01e40893ed9a4e50ebd69c4a20d473d5d2cb436a967663916e7c09"} Feb 17 09:15:09 crc kubenswrapper[4848]: I0217 09:15:09.609576 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-crrm4" Feb 17 09:15:09 crc kubenswrapper[4848]: I0217 09:15:09.612716 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk" event={"ID":"4851dc21-51d9-4c87-a15c-4b7295155016","Type":"ContainerStarted","Data":"1dc6fe6f425056716322505c6cd603a64cc0833f1c23fdf75af50c0de9e99128"} Feb 17 09:15:09 crc kubenswrapper[4848]: I0217 09:15:09.616077 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-whbhm" event={"ID":"08f85cef-d7cc-46c2-a1ff-ba22a9b098ab","Type":"ContainerStarted","Data":"d0c7f41c98d3188bca1356f635248dc211e2331ce3e0420986dc08ac4b97f25c"} Feb 17 09:15:09 crc kubenswrapper[4848]: I0217 09:15:09.661382 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-whbhm" podStartSLOduration=2.671737283 podStartE2EDuration="6.661350668s" podCreationTimestamp="2026-02-17 09:15:03 +0000 UTC" firstStartedPulling="2026-02-17 09:15:04.82331886 +0000 UTC m=+582.366574546" lastFinishedPulling="2026-02-17 09:15:08.812932245 +0000 UTC m=+586.356187931" observedRunningTime="2026-02-17 09:15:09.654574093 +0000 UTC m=+587.197829849" watchObservedRunningTime="2026-02-17 09:15:09.661350668 +0000 UTC m=+587.204606394" Feb 17 09:15:09 crc kubenswrapper[4848]: I0217 09:15:09.662957 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-crrm4" podStartSLOduration=2.7985789150000002 podStartE2EDuration="6.662942884s" podCreationTimestamp="2026-02-17 09:15:03 +0000 UTC" firstStartedPulling="2026-02-17 09:15:04.844023724 +0000 UTC m=+582.387279410" lastFinishedPulling="2026-02-17 09:15:08.708387723 +0000 UTC m=+586.251643379" observedRunningTime="2026-02-17 09:15:09.631306175 +0000 UTC m=+587.174561861" watchObservedRunningTime="2026-02-17 09:15:09.662942884 +0000 UTC m=+587.206198580" Feb 17 09:15:09 crc kubenswrapper[4848]: I0217 09:15:09.695202 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-8pgtk" podStartSLOduration=2.797702641 podStartE2EDuration="6.695151139s" podCreationTimestamp="2026-02-17 09:15:03 +0000 UTC" firstStartedPulling="2026-02-17 09:15:04.823831745 +0000 UTC m=+582.367087391" lastFinishedPulling="2026-02-17 09:15:08.721280203 +0000 UTC m=+586.264535889" observedRunningTime="2026-02-17 09:15:09.685659566 +0000 UTC m=+587.228915252" watchObservedRunningTime="2026-02-17 09:15:09.695151139 +0000 UTC m=+587.238406835" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.143093 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4fvgf"] Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.144128 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovn-controller" containerID="cri-o://37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7" gracePeriod=30 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.144182 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="nbdb" containerID="cri-o://0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3" gracePeriod=30 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.144241 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b" gracePeriod=30 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.144256 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" 
podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="kube-rbac-proxy-node" containerID="cri-o://79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4" gracePeriod=30 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.144345 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovn-acl-logging" containerID="cri-o://8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b" gracePeriod=30 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.144471 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="northd" containerID="cri-o://cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6" gracePeriod=30 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.144624 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="sbdb" containerID="cri-o://d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5" gracePeriod=30 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.191573 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" containerID="cri-o://50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b" gracePeriod=30 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.346111 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-crrm4" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.474493 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/3.log" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.477042 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovn-acl-logging/0.log" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.477682 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovn-controller/0.log" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.478259 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.551320 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-mtm9t"] Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.552068 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="kube-rbac-proxy-node" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.552342 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="kube-rbac-proxy-node" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.552481 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="sbdb" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.552608 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="sbdb" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.552821 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 
09:15:14.552989 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.553135 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="northd" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.553277 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="northd" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.553424 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovn-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.553564 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovn-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.554234 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovn-acl-logging" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.554396 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovn-acl-logging" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.554504 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.554601 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.554701 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.554844 4848 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.554967 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.555082 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.555194 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="nbdb" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.555295 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="nbdb" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.555403 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.555504 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.555604 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="kubecfg-setup" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.555709 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="kubecfg-setup" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.556046 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.556176 4848 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.556277 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="nbdb" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.556378 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="sbdb" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.556478 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="northd" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.556582 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="kube-rbac-proxy-ovn-metrics" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.556703 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovn-acl-logging" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.556880 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovn-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.557000 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.557134 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="kube-rbac-proxy-node" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.557428 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: 
I0217 09:15:14.557538 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.557869 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.558370 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerName="ovnkube-controller" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.561541 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.650154 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rgmx_ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6/kube-multus/2.log" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.650792 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rgmx_ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6/kube-multus/1.log" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.650855 4848 generic.go:334] "Generic (PLEG): container finished" podID="ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6" containerID="c2340b994635fb61af0accf68fe141fcf64aaa0467d09e7a83fdf9cbd2ea65ed" exitCode=2 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.650965 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6rgmx" event={"ID":"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6","Type":"ContainerDied","Data":"c2340b994635fb61af0accf68fe141fcf64aaa0467d09e7a83fdf9cbd2ea65ed"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.651034 4848 scope.go:117] "RemoveContainer" containerID="819c45bf58d7144faff1ddfb973187d2fa7bced89ad4d5d32ad750d5dc4f39ab" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.651893 4848 
scope.go:117] "RemoveContainer" containerID="c2340b994635fb61af0accf68fe141fcf64aaa0467d09e7a83fdf9cbd2ea65ed" Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.652750 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6rgmx_openshift-multus(ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6)\"" pod="openshift-multus/multus-6rgmx" podUID="ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.653974 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovnkube-controller/3.log" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.656608 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovn-acl-logging/0.log" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657249 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-4fvgf_2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/ovn-controller/0.log" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657731 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b" exitCode=0 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657811 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5" exitCode=0 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657828 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3" exitCode=0 Feb 17 09:15:14 
crc kubenswrapper[4848]: I0217 09:15:14.657845 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6" exitCode=0 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657859 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b" exitCode=0 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657873 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4" exitCode=0 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657886 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b" exitCode=143 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657899 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" containerID="37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7" exitCode=143 Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657932 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657972 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.657994 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658015 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658035 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658056 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658076 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658094 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658107 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658119 4848 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658131 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658143 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658155 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658167 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658178 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658189 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658205 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"} Feb 17 
09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658225 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658238 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658250 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658261 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658272 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658283 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658295 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658307 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"} Feb 17 
09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658318 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658330 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658348 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658366 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658380 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658392 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658405 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658418 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658430 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658438 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658442 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658590 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658605 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658617 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658635 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-4fvgf" event={"ID":"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8","Type":"ContainerDied","Data":"c9750d1add3ad0fb790a8bb1dc39fcd028bf1a3fe185f34e473c0dc886886286"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658655 4848 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658668 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658680 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658692 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658703 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658714 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658726 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658737 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658750 4848 pod_container_deletor.go:114] "Failed to issue the request to remove 
container" containerID={"Type":"cri-o","ID":"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.658782 4848 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"} Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659150 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-kubelet\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659197 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-var-lib-openvswitch\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659225 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-systemd\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659274 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659292 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659312 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659328 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-config\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659355 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659361 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-node-log\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659402 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-node-log" (OuterVolumeSpecName: "node-log") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659430 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-script-lib\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659468 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-openvswitch\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659507 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-systemd-units\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659522 4848 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659543 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwttz\" (UniqueName: \"kubernetes.io/projected/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-kube-api-access-lwttz\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659565 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659579 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-env-overrides\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659620 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-etc-openvswitch\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659690 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-netns\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659727 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-netd\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659802 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-ovn\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659832 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659842 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659872 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.659924 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660152 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-ovn-kubernetes\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660237 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovn-node-metrics-cert\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660283 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-bin\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660233 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660336 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-slash\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660363 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-log-socket\") pod \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\" (UID: \"2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8\") " Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660389 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660428 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-slash" (OuterVolumeSpecName: "host-slash") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660436 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660507 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-log-socket" (OuterVolumeSpecName: "log-socket") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660593 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-cni-bin\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660665 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-cni-netd\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660730 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-run-netns\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660750 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" 
(UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.660809 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5d6bca7-91a2-456f-a858-b88a290aadf2-env-overrides\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661121 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-systemd-units\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661197 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-kubelet\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661254 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-etc-openvswitch\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661304 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-var-lib-openvswitch\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661404 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-run-openvswitch\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661476 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661483 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5d6bca7-91a2-456f-a858-b88a290aadf2-ovnkube-config\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661569 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-run-ovn-kubernetes\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661711 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-log-socket\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661798 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-slash\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661856 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-run-ovn\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.661987 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5d6bca7-91a2-456f-a858-b88a290aadf2-ovn-node-metrics-cert\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662017 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5d6bca7-91a2-456f-a858-b88a290aadf2-ovnkube-script-lib\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662071 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-node-log\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662091 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttffh\" (UniqueName: \"kubernetes.io/projected/c5d6bca7-91a2-456f-a858-b88a290aadf2-kube-api-access-ttffh\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662113 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-run-systemd\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662135 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662202 4848 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-bin\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662216 4848 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-slash\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662227 4848 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-log-socket\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662252 4848 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-kubelet\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662294 4848 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662327 4848 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662348 4848 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-config\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662366 4848 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-node-log\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662382 4848 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662399 4848 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662416 4848 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-systemd-units\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662431 4848 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662448 4848 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662466 4848 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-netns\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662482 4848 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-cni-netd\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662500 4848 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-ovn\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.662515 4848 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.669041 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.670201 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-kube-api-access-lwttz" (OuterVolumeSpecName: "kube-api-access-lwttz") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "kube-api-access-lwttz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.688936 4848 scope.go:117] "RemoveContainer" containerID="50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.691543 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" (UID: "2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.711791 4848 scope.go:117] "RemoveContainer" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.738010 4848 scope.go:117] "RemoveContainer" containerID="d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.758876 4848 scope.go:117] "RemoveContainer" containerID="0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.763845 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5d6bca7-91a2-456f-a858-b88a290aadf2-ovnkube-config\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.763889 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-run-ovn-kubernetes\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.763937 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-log-socket\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.763970 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-slash\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.763999 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-run-ovn\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764035 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5d6bca7-91a2-456f-a858-b88a290aadf2-ovn-node-metrics-cert\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764068 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5d6bca7-91a2-456f-a858-b88a290aadf2-ovnkube-script-lib\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764107 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttffh\" (UniqueName: \"kubernetes.io/projected/c5d6bca7-91a2-456f-a858-b88a290aadf2-kube-api-access-ttffh\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764106 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-run-ovn-kubernetes\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764136 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-node-log\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764208 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-log-socket\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764250 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-run-systemd\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764289 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-run-ovn\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764305 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764375 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-cni-bin\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764422 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-cni-netd\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764493 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5d6bca7-91a2-456f-a858-b88a290aadf2-env-overrides\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764528 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-run-netns\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764569 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-systemd-units\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764624 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-kubelet\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764678 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-etc-openvswitch\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764725 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-var-lib-openvswitch\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764825 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-run-openvswitch\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764949 4848 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764957 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5d6bca7-91a2-456f-a858-b88a290aadf2-ovnkube-config\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764973 4848 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-run-systemd\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764995 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwttz\" (UniqueName: \"kubernetes.io/projected/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8-kube-api-access-lwttz\") on node \"crc\" DevicePath \"\""
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765047 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-run-openvswitch\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765064 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-cni-netd\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764178 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-node-log\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765124 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-run-systemd\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.764259 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-slash\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765164 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765194 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-cni-bin\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765483 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-kubelet\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765517 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-host-run-netns\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765545 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-systemd-units\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765571 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-etc-openvswitch\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765637 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d6bca7-91a2-456f-a858-b88a290aadf2-var-lib-openvswitch\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.765982 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5d6bca7-91a2-456f-a858-b88a290aadf2-env-overrides\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.766139 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5d6bca7-91a2-456f-a858-b88a290aadf2-ovnkube-script-lib\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.770408 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5d6bca7-91a2-456f-a858-b88a290aadf2-ovn-node-metrics-cert\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.777837 4848 scope.go:117] "RemoveContainer" containerID="cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.800292 4848 scope.go:117] "RemoveContainer" containerID="46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.800505 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttffh\" (UniqueName: \"kubernetes.io/projected/c5d6bca7-91a2-456f-a858-b88a290aadf2-kube-api-access-ttffh\") pod \"ovnkube-node-mtm9t\" (UID: \"c5d6bca7-91a2-456f-a858-b88a290aadf2\") " pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.818346 4848 scope.go:117] "RemoveContainer" containerID="79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.839154 4848 scope.go:117] "RemoveContainer" containerID="8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.858708 4848 scope.go:117] "RemoveContainer" containerID="37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.874132 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.876799 4848 scope.go:117] "RemoveContainer" containerID="32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.898003 4848 scope.go:117] "RemoveContainer" containerID="50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"
Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.898670 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b\": container with ID starting with 50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b not found: ID does not exist" containerID="50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.898820 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"} err="failed to get container status \"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b\": rpc error: code = NotFound desc = could not find container \"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b\": container with ID starting with 50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b not found: ID does not exist"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.898935 4848 scope.go:117] "RemoveContainer" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"
Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.899610 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\": container with ID starting with fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624 not found: ID does not exist" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.899679 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"} err="failed to get container status \"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\": rpc error: code = NotFound desc = could not find container \"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\": container with ID starting with fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624 not found: ID does not exist"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.899720 4848 scope.go:117] "RemoveContainer" containerID="d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"
Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.900281 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\": container with ID starting with d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5 not found: ID does not exist" containerID="d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.900352 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"} err="failed to get container status \"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\": rpc error: code = NotFound desc = could not find container \"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\": container with ID starting with d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5 not found: ID does not exist"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.900384 4848 scope.go:117] "RemoveContainer" containerID="0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"
Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.900705 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\": container with ID starting with 0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3 not found: ID does not exist" containerID="0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.900744 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"} err="failed to get container status \"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\": rpc error: code = NotFound desc = could not find container \"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\": container with ID starting with 0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3 not found: ID does not exist"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.900795 4848 scope.go:117] "RemoveContainer" containerID="cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"
Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.901354 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\": container with ID starting with cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6 not found: ID does not exist" containerID="cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.901449 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"} err="failed to get container status \"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\": rpc error: code = NotFound desc = could not find container \"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\": container with ID starting with cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6 not found: ID does not exist"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.901560 4848 scope.go:117] "RemoveContainer" containerID="46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"
Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.902333 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\": container with ID starting with 46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b not found: ID does not exist" containerID="46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.902375 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"} err="failed to get container status \"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\": rpc error: code = NotFound desc = could not find container \"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\": container with ID starting with 46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b not found: ID does not exist"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.902403 4848 scope.go:117] "RemoveContainer" containerID="79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"
Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.902815 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\": container with ID starting with 79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4 not found: ID does not exist" containerID="79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.902956 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"} err="failed to get container status \"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\": rpc error: code = NotFound desc = could not find container \"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\": container with ID starting with 79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4 not found: ID does not exist"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.903045 4848 scope.go:117] "RemoveContainer" containerID="8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"
Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.903485 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\": container with ID starting with 8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b not found: ID does not exist" containerID="8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.903527 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"} err="failed to get container status \"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\": rpc error: code = NotFound desc = could not find container \"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\": container with ID starting with 8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b not found: ID does not exist"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.903555 4848 scope.go:117] "RemoveContainer" containerID="37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"
Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.906426 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\": container with ID starting with 37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7 not found: ID does not exist" containerID="37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.906603 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"} err="failed to get container status \"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\": rpc error: code = NotFound desc = could not find container \"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\": container with ID starting with 37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7 not found: ID does not exist"
Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.906693 4848 scope.go:117] "RemoveContainer" containerID="32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"
Feb 17 09:15:14 crc kubenswrapper[4848]: E0217 09:15:14.907137 4848 log.go:32] "ContainerStatus from runtime service
failed" err="rpc error: code = NotFound desc = could not find container \"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\": container with ID starting with 32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16 not found: ID does not exist" containerID="32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.907246 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"} err="failed to get container status \"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\": rpc error: code = NotFound desc = could not find container \"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\": container with ID starting with 32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.907347 4848 scope.go:117] "RemoveContainer" containerID="50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.907720 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"} err="failed to get container status \"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b\": rpc error: code = NotFound desc = could not find container \"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b\": container with ID starting with 50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.907790 4848 scope.go:117] "RemoveContainer" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.908156 4848 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"} err="failed to get container status \"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\": rpc error: code = NotFound desc = could not find container \"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\": container with ID starting with fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.908260 4848 scope.go:117] "RemoveContainer" containerID="d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.908638 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"} err="failed to get container status \"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\": rpc error: code = NotFound desc = could not find container \"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\": container with ID starting with d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.908781 4848 scope.go:117] "RemoveContainer" containerID="0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.910164 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"} err="failed to get container status \"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\": rpc error: code = NotFound desc = could not find container \"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\": container with ID starting with 
0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.910200 4848 scope.go:117] "RemoveContainer" containerID="cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.910934 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"} err="failed to get container status \"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\": rpc error: code = NotFound desc = could not find container \"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\": container with ID starting with cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.911004 4848 scope.go:117] "RemoveContainer" containerID="46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.911708 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"} err="failed to get container status \"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\": rpc error: code = NotFound desc = could not find container \"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\": container with ID starting with 46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.911804 4848 scope.go:117] "RemoveContainer" containerID="79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.912159 4848 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"} err="failed to get container status \"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\": rpc error: code = NotFound desc = could not find container \"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\": container with ID starting with 79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.912200 4848 scope.go:117] "RemoveContainer" containerID="8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.912851 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"} err="failed to get container status \"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\": rpc error: code = NotFound desc = could not find container \"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\": container with ID starting with 8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.912894 4848 scope.go:117] "RemoveContainer" containerID="37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.913352 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"} err="failed to get container status \"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\": rpc error: code = NotFound desc = could not find container \"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\": container with ID starting with 37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7 not found: ID does not 
exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.913435 4848 scope.go:117] "RemoveContainer" containerID="32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.913836 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"} err="failed to get container status \"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\": rpc error: code = NotFound desc = could not find container \"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\": container with ID starting with 32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.913883 4848 scope.go:117] "RemoveContainer" containerID="50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.914823 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"} err="failed to get container status \"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b\": rpc error: code = NotFound desc = could not find container \"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b\": container with ID starting with 50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.914864 4848 scope.go:117] "RemoveContainer" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.915311 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"} err="failed to get container status 
\"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\": rpc error: code = NotFound desc = could not find container \"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\": container with ID starting with fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.915344 4848 scope.go:117] "RemoveContainer" containerID="d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.917900 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"} err="failed to get container status \"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\": rpc error: code = NotFound desc = could not find container \"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\": container with ID starting with d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.917944 4848 scope.go:117] "RemoveContainer" containerID="0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.918449 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"} err="failed to get container status \"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\": rpc error: code = NotFound desc = could not find container \"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\": container with ID starting with 0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.918537 4848 scope.go:117] "RemoveContainer" 
containerID="cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.919154 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"} err="failed to get container status \"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\": rpc error: code = NotFound desc = could not find container \"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\": container with ID starting with cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.919199 4848 scope.go:117] "RemoveContainer" containerID="46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.919732 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"} err="failed to get container status \"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\": rpc error: code = NotFound desc = could not find container \"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\": container with ID starting with 46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.919810 4848 scope.go:117] "RemoveContainer" containerID="79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.920258 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"} err="failed to get container status \"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\": rpc error: code = NotFound desc = could 
not find container \"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\": container with ID starting with 79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.920300 4848 scope.go:117] "RemoveContainer" containerID="8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.920720 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"} err="failed to get container status \"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\": rpc error: code = NotFound desc = could not find container \"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\": container with ID starting with 8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.920794 4848 scope.go:117] "RemoveContainer" containerID="37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.921255 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"} err="failed to get container status \"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\": rpc error: code = NotFound desc = could not find container \"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\": container with ID starting with 37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.921298 4848 scope.go:117] "RemoveContainer" containerID="32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 
09:15:14.921950 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"} err="failed to get container status \"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\": rpc error: code = NotFound desc = could not find container \"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\": container with ID starting with 32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.921993 4848 scope.go:117] "RemoveContainer" containerID="50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.922408 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b"} err="failed to get container status \"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b\": rpc error: code = NotFound desc = could not find container \"50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b\": container with ID starting with 50d4b5d352fbcb0aa5fef6e7c79ebb53ba4edbfaf07b71561ccb8c6b165bfd6b not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.922452 4848 scope.go:117] "RemoveContainer" containerID="fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.923107 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624"} err="failed to get container status \"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\": rpc error: code = NotFound desc = could not find container \"fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624\": container with ID starting with 
fc8bb9922c5f6ed96cdd9962aa33ff937209d6073ffaa9e0c3ae480303211624 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.923173 4848 scope.go:117] "RemoveContainer" containerID="d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.923662 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5"} err="failed to get container status \"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\": rpc error: code = NotFound desc = could not find container \"d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5\": container with ID starting with d5c084a4ad2949d45c4c367785912ef9fcbb86644c59598066eb8b816831ccc5 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.923708 4848 scope.go:117] "RemoveContainer" containerID="0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.924122 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3"} err="failed to get container status \"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\": rpc error: code = NotFound desc = could not find container \"0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3\": container with ID starting with 0ce24a0f2f871afde982d67e8b06600b7bc977c0c28472b30e9e6408bd5551b3 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.924180 4848 scope.go:117] "RemoveContainer" containerID="cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.924738 4848 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6"} err="failed to get container status \"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\": rpc error: code = NotFound desc = could not find container \"cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6\": container with ID starting with cd31de9aadf7291b679d4556a1b3be65bdf551f9f91e254d79086e30e9fda6a6 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.924804 4848 scope.go:117] "RemoveContainer" containerID="46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.925417 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b"} err="failed to get container status \"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\": rpc error: code = NotFound desc = could not find container \"46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b\": container with ID starting with 46d73b2669d48acdb4a8dcd3eba58729bd332d80994bb2bf74e7fa25a58da28b not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.925470 4848 scope.go:117] "RemoveContainer" containerID="79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.933158 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4"} err="failed to get container status \"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\": rpc error: code = NotFound desc = could not find container \"79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4\": container with ID starting with 79c0bb3e1aed0cbbb1f019d7e2ee5f05de718f8189cba9d28e6f039a1f5d77d4 not found: ID does not 
exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.933204 4848 scope.go:117] "RemoveContainer" containerID="8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.933615 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b"} err="failed to get container status \"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\": rpc error: code = NotFound desc = could not find container \"8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b\": container with ID starting with 8759b3d84e4a171f9bede3c03d51512b915189d4b4980dd76ceaac3f2093b62b not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.933677 4848 scope.go:117] "RemoveContainer" containerID="37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.934035 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7"} err="failed to get container status \"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\": rpc error: code = NotFound desc = could not find container \"37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7\": container with ID starting with 37ef6976436a077e27e85071a01794b7d16be021785b9a6252822ceb53c04ff7 not found: ID does not exist" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.934143 4848 scope.go:117] "RemoveContainer" containerID="32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16" Feb 17 09:15:14 crc kubenswrapper[4848]: I0217 09:15:14.934810 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16"} err="failed to get container status 
\"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\": rpc error: code = NotFound desc = could not find container \"32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16\": container with ID starting with 32f87cc5ea35183f7c0ef5aa219558f86f96264297bb8722d20987767085cf16 not found: ID does not exist" Feb 17 09:15:15 crc kubenswrapper[4848]: I0217 09:15:15.037932 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4fvgf"] Feb 17 09:15:15 crc kubenswrapper[4848]: I0217 09:15:15.048414 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-4fvgf"] Feb 17 09:15:15 crc kubenswrapper[4848]: I0217 09:15:15.396959 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8" path="/var/lib/kubelet/pods/2f1c8d8b-192e-4f7f-a78a-0ed9b92e80e8/volumes" Feb 17 09:15:15 crc kubenswrapper[4848]: I0217 09:15:15.669209 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rgmx_ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6/kube-multus/2.log" Feb 17 09:15:15 crc kubenswrapper[4848]: I0217 09:15:15.674437 4848 generic.go:334] "Generic (PLEG): container finished" podID="c5d6bca7-91a2-456f-a858-b88a290aadf2" containerID="12e0f5adffcaaaf4cbf0c75de94275eaa9c5007bc721e048463682eb7a43ce4f" exitCode=0 Feb 17 09:15:15 crc kubenswrapper[4848]: I0217 09:15:15.674504 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" event={"ID":"c5d6bca7-91a2-456f-a858-b88a290aadf2","Type":"ContainerDied","Data":"12e0f5adffcaaaf4cbf0c75de94275eaa9c5007bc721e048463682eb7a43ce4f"} Feb 17 09:15:15 crc kubenswrapper[4848]: I0217 09:15:15.674576 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" 
event={"ID":"c5d6bca7-91a2-456f-a858-b88a290aadf2","Type":"ContainerStarted","Data":"63cae094feb441697408533ec794c0deead92195ec68244c9865480fbfdd139a"} Feb 17 09:15:16 crc kubenswrapper[4848]: I0217 09:15:16.685269 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" event={"ID":"c5d6bca7-91a2-456f-a858-b88a290aadf2","Type":"ContainerStarted","Data":"527f364cb8d4771f7a1f5da9383a39254a6b7702e422fc41e8d021be09b55863"} Feb 17 09:15:16 crc kubenswrapper[4848]: I0217 09:15:16.685946 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" event={"ID":"c5d6bca7-91a2-456f-a858-b88a290aadf2","Type":"ContainerStarted","Data":"af184839116ab7effcf4f464386f2c8562a2951c84ffe323fc7d538474cec0a0"} Feb 17 09:15:16 crc kubenswrapper[4848]: I0217 09:15:16.685973 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" event={"ID":"c5d6bca7-91a2-456f-a858-b88a290aadf2","Type":"ContainerStarted","Data":"7df8ea068dcc42a0ff900dc108f280ed39183d6c3aaaee101fdcc9d956dfe621"} Feb 17 09:15:16 crc kubenswrapper[4848]: I0217 09:15:16.685995 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" event={"ID":"c5d6bca7-91a2-456f-a858-b88a290aadf2","Type":"ContainerStarted","Data":"21605186db098207074f37250a0e758b97ff2dd01e329197ff827ef7a7c97089"} Feb 17 09:15:16 crc kubenswrapper[4848]: I0217 09:15:16.686014 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" event={"ID":"c5d6bca7-91a2-456f-a858-b88a290aadf2","Type":"ContainerStarted","Data":"d6561e62aa66004e0599f4a2119cd6edf3e6063f849840b8dfddffb8cd044e4d"} Feb 17 09:15:16 crc kubenswrapper[4848]: I0217 09:15:16.686033 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" 
event={"ID":"c5d6bca7-91a2-456f-a858-b88a290aadf2","Type":"ContainerStarted","Data":"09aab9d44e3163d01b1e4540a841178fe6f44d68e9b71acf75ee521dd4244394"} Feb 17 09:15:18 crc kubenswrapper[4848]: I0217 09:15:18.771924 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:15:18 crc kubenswrapper[4848]: I0217 09:15:18.772018 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:15:19 crc kubenswrapper[4848]: I0217 09:15:19.711113 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" event={"ID":"c5d6bca7-91a2-456f-a858-b88a290aadf2","Type":"ContainerStarted","Data":"5275dc30577ceb1e7fd063c57866cf71078ae897a7c59443888683feb14f0b59"} Feb 17 09:15:21 crc kubenswrapper[4848]: I0217 09:15:21.743722 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" event={"ID":"c5d6bca7-91a2-456f-a858-b88a290aadf2","Type":"ContainerStarted","Data":"059e3846acf8de6f9a05890b57f935e49cd0171568faedac1c4098c226df5664"} Feb 17 09:15:21 crc kubenswrapper[4848]: I0217 09:15:21.744156 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:21 crc kubenswrapper[4848]: I0217 09:15:21.744230 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:21 crc kubenswrapper[4848]: I0217 09:15:21.744242 4848 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:21 crc kubenswrapper[4848]: I0217 09:15:21.775149 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:21 crc kubenswrapper[4848]: I0217 09:15:21.778978 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:21 crc kubenswrapper[4848]: I0217 09:15:21.790165 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" podStartSLOduration=7.790148357 podStartE2EDuration="7.790148357s" podCreationTimestamp="2026-02-17 09:15:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:15:21.787562442 +0000 UTC m=+599.330818108" watchObservedRunningTime="2026-02-17 09:15:21.790148357 +0000 UTC m=+599.333404023" Feb 17 09:15:23 crc kubenswrapper[4848]: I0217 09:15:23.674035 4848 scope.go:117] "RemoveContainer" containerID="d4a4bce4d9a008f93559bead769e80155e1bc4a0282ea23e8266cd19751ea42c" Feb 17 09:15:29 crc kubenswrapper[4848]: I0217 09:15:29.383623 4848 scope.go:117] "RemoveContainer" containerID="c2340b994635fb61af0accf68fe141fcf64aaa0467d09e7a83fdf9cbd2ea65ed" Feb 17 09:15:29 crc kubenswrapper[4848]: E0217 09:15:29.384670 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-6rgmx_openshift-multus(ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6)\"" pod="openshift-multus/multus-6rgmx" podUID="ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6" Feb 17 09:15:42 crc kubenswrapper[4848]: I0217 09:15:42.383600 4848 scope.go:117] "RemoveContainer" containerID="c2340b994635fb61af0accf68fe141fcf64aaa0467d09e7a83fdf9cbd2ea65ed" Feb 17 09:15:42 crc 
kubenswrapper[4848]: I0217 09:15:42.898659 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-6rgmx_ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6/kube-multus/2.log" Feb 17 09:15:42 crc kubenswrapper[4848]: I0217 09:15:42.901127 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-6rgmx" event={"ID":"ddcd58be-cbc2-4c49-b9fd-75e8d53e6ce6","Type":"ContainerStarted","Data":"7e19c28edca78a4832f52801444a7f2fd35c32aa1b298309f435c3756b245eb3"} Feb 17 09:15:44 crc kubenswrapper[4848]: I0217 09:15:44.913339 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mtm9t" Feb 17 09:15:48 crc kubenswrapper[4848]: I0217 09:15:48.772216 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:15:48 crc kubenswrapper[4848]: I0217 09:15:48.772737 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:15:48 crc kubenswrapper[4848]: I0217 09:15:48.772878 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:15:48 crc kubenswrapper[4848]: I0217 09:15:48.774011 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"22658753e35aaed607fe5a9bec19a88c03d7cd46abf15a9cf79a3c5b734a8c5c"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:15:48 crc kubenswrapper[4848]: I0217 09:15:48.774124 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://22658753e35aaed607fe5a9bec19a88c03d7cd46abf15a9cf79a3c5b734a8c5c" gracePeriod=600 Feb 17 09:15:49 crc kubenswrapper[4848]: I0217 09:15:49.961520 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="22658753e35aaed607fe5a9bec19a88c03d7cd46abf15a9cf79a3c5b734a8c5c" exitCode=0 Feb 17 09:15:49 crc kubenswrapper[4848]: I0217 09:15:49.961561 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"22658753e35aaed607fe5a9bec19a88c03d7cd46abf15a9cf79a3c5b734a8c5c"} Feb 17 09:15:49 crc kubenswrapper[4848]: I0217 09:15:49.961590 4848 scope.go:117] "RemoveContainer" containerID="c10c26abd2aa6ffa5969a35af04b1b8e427c06a994c5d6c12ac096d788cf26a4" Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.737932 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28"] Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.740005 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.743256 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.755377 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28"] Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.884542 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.884660 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.884713 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp7h9\" (UniqueName: \"kubernetes.io/projected/8eb1e57e-4c70-41bf-a650-989b432ce3b6-kube-api-access-cp7h9\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:50 crc kubenswrapper[4848]: 
I0217 09:15:50.971857 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"cf7c734d597165a992ca275dfa403ad67456d929b1c93b35482f6a777604c954"} Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.985859 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.985951 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp7h9\" (UniqueName: \"kubernetes.io/projected/8eb1e57e-4c70-41bf-a650-989b432ce3b6-kube-api-access-cp7h9\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.986505 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.987185 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-bundle\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28\" (UID: 
\"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:50 crc kubenswrapper[4848]: I0217 09:15:50.987483 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-util\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:51 crc kubenswrapper[4848]: I0217 09:15:51.020510 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp7h9\" (UniqueName: \"kubernetes.io/projected/8eb1e57e-4c70-41bf-a650-989b432ce3b6-kube-api-access-cp7h9\") pod \"f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:51 crc kubenswrapper[4848]: I0217 09:15:51.068563 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:51 crc kubenswrapper[4848]: I0217 09:15:51.372019 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28"] Feb 17 09:15:51 crc kubenswrapper[4848]: I0217 09:15:51.983929 4848 generic.go:334] "Generic (PLEG): container finished" podID="8eb1e57e-4c70-41bf-a650-989b432ce3b6" containerID="1cb5740633edc427edd3640e2ff44d38b02d2daccc375d3f00cb77557c7c6ff3" exitCode=0 Feb 17 09:15:51 crc kubenswrapper[4848]: I0217 09:15:51.984002 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" event={"ID":"8eb1e57e-4c70-41bf-a650-989b432ce3b6","Type":"ContainerDied","Data":"1cb5740633edc427edd3640e2ff44d38b02d2daccc375d3f00cb77557c7c6ff3"} Feb 17 09:15:51 crc kubenswrapper[4848]: I0217 09:15:51.984719 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" event={"ID":"8eb1e57e-4c70-41bf-a650-989b432ce3b6","Type":"ContainerStarted","Data":"8124e157addd24ecd50c301ffb9d5533efa209fed3a0f89d078c6878cdf43123"} Feb 17 09:15:54 crc kubenswrapper[4848]: I0217 09:15:54.002693 4848 generic.go:334] "Generic (PLEG): container finished" podID="8eb1e57e-4c70-41bf-a650-989b432ce3b6" containerID="8d02e65b2781eec1903b23a72706d68b1f60863da198cc1fee634ba2f91e38be" exitCode=0 Feb 17 09:15:54 crc kubenswrapper[4848]: I0217 09:15:54.002806 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" event={"ID":"8eb1e57e-4c70-41bf-a650-989b432ce3b6","Type":"ContainerDied","Data":"8d02e65b2781eec1903b23a72706d68b1f60863da198cc1fee634ba2f91e38be"} Feb 17 09:15:55 crc kubenswrapper[4848]: I0217 09:15:55.016722 4848 
generic.go:334] "Generic (PLEG): container finished" podID="8eb1e57e-4c70-41bf-a650-989b432ce3b6" containerID="29fc7570b93eec22a2742ec240dea76aaef9ee1abbce8ff46cd9f8a63700df86" exitCode=0 Feb 17 09:15:55 crc kubenswrapper[4848]: I0217 09:15:55.016817 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" event={"ID":"8eb1e57e-4c70-41bf-a650-989b432ce3b6","Type":"ContainerDied","Data":"29fc7570b93eec22a2742ec240dea76aaef9ee1abbce8ff46cd9f8a63700df86"} Feb 17 09:15:56 crc kubenswrapper[4848]: I0217 09:15:56.428589 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:56 crc kubenswrapper[4848]: I0217 09:15:56.566671 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-util\") pod \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " Feb 17 09:15:56 crc kubenswrapper[4848]: I0217 09:15:56.570121 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-bundle\") pod \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " Feb 17 09:15:56 crc kubenswrapper[4848]: I0217 09:15:56.570276 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp7h9\" (UniqueName: \"kubernetes.io/projected/8eb1e57e-4c70-41bf-a650-989b432ce3b6-kube-api-access-cp7h9\") pod \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\" (UID: \"8eb1e57e-4c70-41bf-a650-989b432ce3b6\") " Feb 17 09:15:56 crc kubenswrapper[4848]: I0217 09:15:56.571623 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-bundle" (OuterVolumeSpecName: "bundle") pod "8eb1e57e-4c70-41bf-a650-989b432ce3b6" (UID: "8eb1e57e-4c70-41bf-a650-989b432ce3b6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:15:56 crc kubenswrapper[4848]: I0217 09:15:56.578472 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8eb1e57e-4c70-41bf-a650-989b432ce3b6-kube-api-access-cp7h9" (OuterVolumeSpecName: "kube-api-access-cp7h9") pod "8eb1e57e-4c70-41bf-a650-989b432ce3b6" (UID: "8eb1e57e-4c70-41bf-a650-989b432ce3b6"). InnerVolumeSpecName "kube-api-access-cp7h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:15:56 crc kubenswrapper[4848]: I0217 09:15:56.588352 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-util" (OuterVolumeSpecName: "util") pod "8eb1e57e-4c70-41bf-a650-989b432ce3b6" (UID: "8eb1e57e-4c70-41bf-a650-989b432ce3b6"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:15:56 crc kubenswrapper[4848]: I0217 09:15:56.671447 4848 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:15:56 crc kubenswrapper[4848]: I0217 09:15:56.671496 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp7h9\" (UniqueName: \"kubernetes.io/projected/8eb1e57e-4c70-41bf-a650-989b432ce3b6-kube-api-access-cp7h9\") on node \"crc\" DevicePath \"\"" Feb 17 09:15:56 crc kubenswrapper[4848]: I0217 09:15:56.671518 4848 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8eb1e57e-4c70-41bf-a650-989b432ce3b6-util\") on node \"crc\" DevicePath \"\"" Feb 17 09:15:57 crc kubenswrapper[4848]: I0217 09:15:57.035138 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" event={"ID":"8eb1e57e-4c70-41bf-a650-989b432ce3b6","Type":"ContainerDied","Data":"8124e157addd24ecd50c301ffb9d5533efa209fed3a0f89d078c6878cdf43123"} Feb 17 09:15:57 crc kubenswrapper[4848]: I0217 09:15:57.035185 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8124e157addd24ecd50c301ffb9d5533efa209fed3a0f89d078c6878cdf43123" Feb 17 09:15:57 crc kubenswrapper[4848]: I0217 09:15:57.035202 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.434794 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-swpl6"] Feb 17 09:15:58 crc kubenswrapper[4848]: E0217 09:15:58.435284 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb1e57e-4c70-41bf-a650-989b432ce3b6" containerName="extract" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.435295 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb1e57e-4c70-41bf-a650-989b432ce3b6" containerName="extract" Feb 17 09:15:58 crc kubenswrapper[4848]: E0217 09:15:58.435305 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb1e57e-4c70-41bf-a650-989b432ce3b6" containerName="pull" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.435311 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb1e57e-4c70-41bf-a650-989b432ce3b6" containerName="pull" Feb 17 09:15:58 crc kubenswrapper[4848]: E0217 09:15:58.435331 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8eb1e57e-4c70-41bf-a650-989b432ce3b6" containerName="util" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.435337 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8eb1e57e-4c70-41bf-a650-989b432ce3b6" containerName="util" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.435425 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="8eb1e57e-4c70-41bf-a650-989b432ce3b6" containerName="extract" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.435865 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-swpl6" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.437665 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.437818 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.438051 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-dzbj9" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.489199 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-swpl6"] Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.597105 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l289b\" (UniqueName: \"kubernetes.io/projected/c406a2fd-a4a9-47fb-bfff-80324dae94c4-kube-api-access-l289b\") pod \"nmstate-operator-694c9596b7-swpl6\" (UID: \"c406a2fd-a4a9-47fb-bfff-80324dae94c4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-swpl6" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.698500 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l289b\" (UniqueName: \"kubernetes.io/projected/c406a2fd-a4a9-47fb-bfff-80324dae94c4-kube-api-access-l289b\") pod \"nmstate-operator-694c9596b7-swpl6\" (UID: \"c406a2fd-a4a9-47fb-bfff-80324dae94c4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-swpl6" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.731908 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l289b\" (UniqueName: \"kubernetes.io/projected/c406a2fd-a4a9-47fb-bfff-80324dae94c4-kube-api-access-l289b\") pod \"nmstate-operator-694c9596b7-swpl6\" (UID: 
\"c406a2fd-a4a9-47fb-bfff-80324dae94c4\") " pod="openshift-nmstate/nmstate-operator-694c9596b7-swpl6" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.753394 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-694c9596b7-swpl6" Feb 17 09:15:58 crc kubenswrapper[4848]: I0217 09:15:58.959031 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-694c9596b7-swpl6"] Feb 17 09:15:59 crc kubenswrapper[4848]: I0217 09:15:59.048228 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-swpl6" event={"ID":"c406a2fd-a4a9-47fb-bfff-80324dae94c4","Type":"ContainerStarted","Data":"4c96f1360d267d87200382f9f45fdcc8e09c62362f06b21a1d8b9f90519ee554"} Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.069436 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-694c9596b7-swpl6" event={"ID":"c406a2fd-a4a9-47fb-bfff-80324dae94c4","Type":"ContainerStarted","Data":"fc58b3fe057f49f6f9827dec3f64f6150d08e338ddec4c6b013d5aa4bf572894"} Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.101576 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-694c9596b7-swpl6" podStartSLOduration=2.022634498 podStartE2EDuration="4.101549277s" podCreationTimestamp="2026-02-17 09:15:58 +0000 UTC" firstStartedPulling="2026-02-17 09:15:58.969372541 +0000 UTC m=+636.512628187" lastFinishedPulling="2026-02-17 09:16:01.04828732 +0000 UTC m=+638.591542966" observedRunningTime="2026-02-17 09:16:02.092568599 +0000 UTC m=+639.635824265" watchObservedRunningTime="2026-02-17 09:16:02.101549277 +0000 UTC m=+639.644804953" Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.940314 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-7txp2"] Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 
09:16:02.941652 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-7txp2" Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.943727 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bcwjg" Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.948245 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-pl898"] Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.948953 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.950162 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.953361 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-7txp2"] Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.962938 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9pxb\" (UniqueName: \"kubernetes.io/projected/989f6a1e-38ab-40a8-94aa-faadc620efca-kube-api-access-s9pxb\") pod \"nmstate-metrics-58c85c668d-7txp2\" (UID: \"989f6a1e-38ab-40a8-94aa-faadc620efca\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-7txp2" Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.963018 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/08ae32ff-43fc-4536-b68a-45e4fd947a2d-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pl898\" (UID: \"08ae32ff-43fc-4536-b68a-45e4fd947a2d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.963185 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4fpn\" (UniqueName: \"kubernetes.io/projected/08ae32ff-43fc-4536-b68a-45e4fd947a2d-kube-api-access-c4fpn\") pod \"nmstate-webhook-866bcb46dc-pl898\" (UID: \"08ae32ff-43fc-4536-b68a-45e4fd947a2d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.969537 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-6fjkq"] Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.970185 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:02 crc kubenswrapper[4848]: I0217 09:16:02.978848 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-pl898"] Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.063833 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4fpn\" (UniqueName: \"kubernetes.io/projected/08ae32ff-43fc-4536-b68a-45e4fd947a2d-kube-api-access-c4fpn\") pod \"nmstate-webhook-866bcb46dc-pl898\" (UID: \"08ae32ff-43fc-4536-b68a-45e4fd947a2d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.063885 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/473852b6-e35d-4b9f-8b47-e55ccb774b93-nmstate-lock\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.063919 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9pxb\" (UniqueName: \"kubernetes.io/projected/989f6a1e-38ab-40a8-94aa-faadc620efca-kube-api-access-s9pxb\") pod 
\"nmstate-metrics-58c85c668d-7txp2\" (UID: \"989f6a1e-38ab-40a8-94aa-faadc620efca\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-7txp2" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.063986 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/08ae32ff-43fc-4536-b68a-45e4fd947a2d-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pl898\" (UID: \"08ae32ff-43fc-4536-b68a-45e4fd947a2d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.064024 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/473852b6-e35d-4b9f-8b47-e55ccb774b93-dbus-socket\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.064058 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/473852b6-e35d-4b9f-8b47-e55ccb774b93-ovs-socket\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.064085 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pfc4\" (UniqueName: \"kubernetes.io/projected/473852b6-e35d-4b9f-8b47-e55ccb774b93-kube-api-access-2pfc4\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: E0217 09:16:03.064154 4848 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Feb 17 09:16:03 crc kubenswrapper[4848]: E0217 
09:16:03.064231 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/08ae32ff-43fc-4536-b68a-45e4fd947a2d-tls-key-pair podName:08ae32ff-43fc-4536-b68a-45e4fd947a2d nodeName:}" failed. No retries permitted until 2026-02-17 09:16:03.564209741 +0000 UTC m=+641.107465387 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/08ae32ff-43fc-4536-b68a-45e4fd947a2d-tls-key-pair") pod "nmstate-webhook-866bcb46dc-pl898" (UID: "08ae32ff-43fc-4536-b68a-45e4fd947a2d") : secret "openshift-nmstate-webhook" not found Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.071546 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r"] Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.072321 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.079419 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.079710 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-pdxlh" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.079979 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.083165 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r"] Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.093917 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4fpn\" (UniqueName: \"kubernetes.io/projected/08ae32ff-43fc-4536-b68a-45e4fd947a2d-kube-api-access-c4fpn\") pod \"nmstate-webhook-866bcb46dc-pl898\" (UID: 
\"08ae32ff-43fc-4536-b68a-45e4fd947a2d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.103023 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9pxb\" (UniqueName: \"kubernetes.io/projected/989f6a1e-38ab-40a8-94aa-faadc620efca-kube-api-access-s9pxb\") pod \"nmstate-metrics-58c85c668d-7txp2\" (UID: \"989f6a1e-38ab-40a8-94aa-faadc620efca\") " pod="openshift-nmstate/nmstate-metrics-58c85c668d-7txp2" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.165353 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/473852b6-e35d-4b9f-8b47-e55ccb774b93-dbus-socket\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.165784 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/413a0360-d8d4-427d-adbc-3d7914e54ea5-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-x4n7r\" (UID: \"413a0360-d8d4-427d-adbc-3d7914e54ea5\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.165691 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/473852b6-e35d-4b9f-8b47-e55ccb774b93-dbus-socket\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.165874 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/473852b6-e35d-4b9f-8b47-e55ccb774b93-ovs-socket\") pod \"nmstate-handler-6fjkq\" (UID: 
\"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.165909 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pfc4\" (UniqueName: \"kubernetes.io/projected/473852b6-e35d-4b9f-8b47-e55ccb774b93-kube-api-access-2pfc4\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.165940 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/473852b6-e35d-4b9f-8b47-e55ccb774b93-ovs-socket\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.165969 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpw8q\" (UniqueName: \"kubernetes.io/projected/413a0360-d8d4-427d-adbc-3d7914e54ea5-kube-api-access-kpw8q\") pod \"nmstate-console-plugin-5c78fc5d65-x4n7r\" (UID: \"413a0360-d8d4-427d-adbc-3d7914e54ea5\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.166075 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/473852b6-e35d-4b9f-8b47-e55ccb774b93-nmstate-lock\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.166147 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/413a0360-d8d4-427d-adbc-3d7914e54ea5-plugin-serving-cert\") pod 
\"nmstate-console-plugin-5c78fc5d65-x4n7r\" (UID: \"413a0360-d8d4-427d-adbc-3d7914e54ea5\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.166204 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/473852b6-e35d-4b9f-8b47-e55ccb774b93-nmstate-lock\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.183538 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pfc4\" (UniqueName: \"kubernetes.io/projected/473852b6-e35d-4b9f-8b47-e55ccb774b93-kube-api-access-2pfc4\") pod \"nmstate-handler-6fjkq\" (UID: \"473852b6-e35d-4b9f-8b47-e55ccb774b93\") " pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.268271 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6588db564f-mt5h5"] Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.268425 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58c85c668d-7txp2" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.269526 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/413a0360-d8d4-427d-adbc-3d7914e54ea5-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-x4n7r\" (UID: \"413a0360-d8d4-427d-adbc-3d7914e54ea5\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.269617 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpw8q\" (UniqueName: \"kubernetes.io/projected/413a0360-d8d4-427d-adbc-3d7914e54ea5-kube-api-access-kpw8q\") pod \"nmstate-console-plugin-5c78fc5d65-x4n7r\" (UID: \"413a0360-d8d4-427d-adbc-3d7914e54ea5\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.269666 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/413a0360-d8d4-427d-adbc-3d7914e54ea5-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-x4n7r\" (UID: \"413a0360-d8d4-427d-adbc-3d7914e54ea5\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.269735 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: E0217 09:16:03.269937 4848 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Feb 17 09:16:03 crc kubenswrapper[4848]: E0217 09:16:03.270014 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/413a0360-d8d4-427d-adbc-3d7914e54ea5-plugin-serving-cert podName:413a0360-d8d4-427d-adbc-3d7914e54ea5 nodeName:}" failed. 
No retries permitted until 2026-02-17 09:16:03.76998594 +0000 UTC m=+641.313241596 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/413a0360-d8d4-427d-adbc-3d7914e54ea5-plugin-serving-cert") pod "nmstate-console-plugin-5c78fc5d65-x4n7r" (UID: "413a0360-d8d4-427d-adbc-3d7914e54ea5") : secret "plugin-serving-cert" not found Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.271489 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/413a0360-d8d4-427d-adbc-3d7914e54ea5-nginx-conf\") pod \"nmstate-console-plugin-5c78fc5d65-x4n7r\" (UID: \"413a0360-d8d4-427d-adbc-3d7914e54ea5\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.276750 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6588db564f-mt5h5"] Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.294235 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.303279 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpw8q\" (UniqueName: \"kubernetes.io/projected/413a0360-d8d4-427d-adbc-3d7914e54ea5-kube-api-access-kpw8q\") pod \"nmstate-console-plugin-5c78fc5d65-x4n7r\" (UID: \"413a0360-d8d4-427d-adbc-3d7914e54ea5\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.371911 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-console-oauth-config\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.371949 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk9lz\" (UniqueName: \"kubernetes.io/projected/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-kube-api-access-fk9lz\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.371975 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-console-serving-cert\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.372093 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-oauth-serving-cert\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.372117 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-service-ca\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.372141 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-console-config\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.372159 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-trusted-ca-bundle\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.472671 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-console-oauth-config\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.473031 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-fk9lz\" (UniqueName: \"kubernetes.io/projected/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-kube-api-access-fk9lz\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.473053 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-console-serving-cert\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.473073 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-oauth-serving-cert\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.473089 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-service-ca\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.473117 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-console-config\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.473143 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-trusted-ca-bundle\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.473479 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58c85c668d-7txp2"] Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.474214 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-trusted-ca-bundle\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.474401 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-console-config\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.474421 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-service-ca\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.474750 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-oauth-serving-cert\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.478455 
4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-console-serving-cert\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.480306 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-console-oauth-config\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: W0217 09:16:03.483057 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod989f6a1e_38ab_40a8_94aa_faadc620efca.slice/crio-5cf0c98eb76071e7d6c7d323565385889e7ca821b36c2240dbcad77f6143ee6a WatchSource:0}: Error finding container 5cf0c98eb76071e7d6c7d323565385889e7ca821b36c2240dbcad77f6143ee6a: Status 404 returned error can't find the container with id 5cf0c98eb76071e7d6c7d323565385889e7ca821b36c2240dbcad77f6143ee6a Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.489212 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk9lz\" (UniqueName: \"kubernetes.io/projected/c8c0fe56-3173-4ba9-878a-b6f5876f13c2-kube-api-access-fk9lz\") pod \"console-6588db564f-mt5h5\" (UID: \"c8c0fe56-3173-4ba9-878a-b6f5876f13c2\") " pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.573901 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/08ae32ff-43fc-4536-b68a-45e4fd947a2d-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pl898\" (UID: \"08ae32ff-43fc-4536-b68a-45e4fd947a2d\") " 
pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.576862 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/08ae32ff-43fc-4536-b68a-45e4fd947a2d-tls-key-pair\") pod \"nmstate-webhook-866bcb46dc-pl898\" (UID: \"08ae32ff-43fc-4536-b68a-45e4fd947a2d\") " pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.638427 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.776007 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/413a0360-d8d4-427d-adbc-3d7914e54ea5-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-x4n7r\" (UID: \"413a0360-d8d4-427d-adbc-3d7914e54ea5\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.781140 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/413a0360-d8d4-427d-adbc-3d7914e54ea5-plugin-serving-cert\") pod \"nmstate-console-plugin-5c78fc5d65-x4n7r\" (UID: \"413a0360-d8d4-427d-adbc-3d7914e54ea5\") " pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.879076 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:03 crc kubenswrapper[4848]: I0217 09:16:03.892205 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6588db564f-mt5h5"] Feb 17 09:16:04 crc kubenswrapper[4848]: I0217 09:16:04.024891 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" Feb 17 09:16:04 crc kubenswrapper[4848]: I0217 09:16:04.087696 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6588db564f-mt5h5" event={"ID":"c8c0fe56-3173-4ba9-878a-b6f5876f13c2","Type":"ContainerStarted","Data":"58255f3e31cfe1598b94070db6dfcda4e43a60bb62c89ceb1203422cec13d5b5"} Feb 17 09:16:04 crc kubenswrapper[4848]: I0217 09:16:04.087798 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6588db564f-mt5h5" event={"ID":"c8c0fe56-3173-4ba9-878a-b6f5876f13c2","Type":"ContainerStarted","Data":"715d873e6998c1af306b6a4762417ea8e01aef583787735eceedc013d50e9c2f"} Feb 17 09:16:04 crc kubenswrapper[4848]: I0217 09:16:04.088959 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-7txp2" event={"ID":"989f6a1e-38ab-40a8-94aa-faadc620efca","Type":"ContainerStarted","Data":"5cf0c98eb76071e7d6c7d323565385889e7ca821b36c2240dbcad77f6143ee6a"} Feb 17 09:16:04 crc kubenswrapper[4848]: I0217 09:16:04.090205 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6fjkq" event={"ID":"473852b6-e35d-4b9f-8b47-e55ccb774b93","Type":"ContainerStarted","Data":"9c1dbbb6e0303957fc49fed8ca7b8f70adee513e0bf494968ab00578af0dc8e6"} Feb 17 09:16:04 crc kubenswrapper[4848]: I0217 09:16:04.116512 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-866bcb46dc-pl898"] Feb 17 09:16:04 crc kubenswrapper[4848]: I0217 09:16:04.431224 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6588db564f-mt5h5" podStartSLOduration=1.431209087 podStartE2EDuration="1.431209087s" podCreationTimestamp="2026-02-17 09:16:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 
09:16:04.123021726 +0000 UTC m=+641.666277372" watchObservedRunningTime="2026-02-17 09:16:04.431209087 +0000 UTC m=+641.974464733" Feb 17 09:16:04 crc kubenswrapper[4848]: I0217 09:16:04.431933 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r"] Feb 17 09:16:04 crc kubenswrapper[4848]: W0217 09:16:04.439223 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod413a0360_d8d4_427d_adbc_3d7914e54ea5.slice/crio-aec524cc30f59182941cc0934b31c25dc52b77980acfc9e82c8fed224c6280a6 WatchSource:0}: Error finding container aec524cc30f59182941cc0934b31c25dc52b77980acfc9e82c8fed224c6280a6: Status 404 returned error can't find the container with id aec524cc30f59182941cc0934b31c25dc52b77980acfc9e82c8fed224c6280a6 Feb 17 09:16:05 crc kubenswrapper[4848]: I0217 09:16:05.095049 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" event={"ID":"08ae32ff-43fc-4536-b68a-45e4fd947a2d","Type":"ContainerStarted","Data":"4fce60ebd6850b41cff0be62b14bdcf9dd66fbabd8993c061d31ab0776625917"} Feb 17 09:16:05 crc kubenswrapper[4848]: I0217 09:16:05.096198 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" event={"ID":"413a0360-d8d4-427d-adbc-3d7914e54ea5","Type":"ContainerStarted","Data":"aec524cc30f59182941cc0934b31c25dc52b77980acfc9e82c8fed224c6280a6"} Feb 17 09:16:06 crc kubenswrapper[4848]: I0217 09:16:06.105117 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-6fjkq" event={"ID":"473852b6-e35d-4b9f-8b47-e55ccb774b93","Type":"ContainerStarted","Data":"3b5e2bceff8eff6ff9b67ad992e4825c46ed8c2485e35e939325fd1b9a65b47e"} Feb 17 09:16:06 crc kubenswrapper[4848]: I0217 09:16:06.105474 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:06 crc kubenswrapper[4848]: I0217 09:16:06.108294 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-7txp2" event={"ID":"989f6a1e-38ab-40a8-94aa-faadc620efca","Type":"ContainerStarted","Data":"b336b59465363d47975342129183b3854015f40ee9e3fc67697d68e618a5ad00"} Feb 17 09:16:06 crc kubenswrapper[4848]: I0217 09:16:06.110418 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" event={"ID":"08ae32ff-43fc-4536-b68a-45e4fd947a2d","Type":"ContainerStarted","Data":"e77d4d7a93b63b2edfb825393f318da53115f46c846f3a74945d66069b63531e"} Feb 17 09:16:06 crc kubenswrapper[4848]: I0217 09:16:06.110945 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:06 crc kubenswrapper[4848]: I0217 09:16:06.124500 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-6fjkq" podStartSLOduration=1.809689678 podStartE2EDuration="4.124477252s" podCreationTimestamp="2026-02-17 09:16:02 +0000 UTC" firstStartedPulling="2026-02-17 09:16:03.328328595 +0000 UTC m=+640.871584231" lastFinishedPulling="2026-02-17 09:16:05.643116149 +0000 UTC m=+643.186371805" observedRunningTime="2026-02-17 09:16:06.120967811 +0000 UTC m=+643.664223467" watchObservedRunningTime="2026-02-17 09:16:06.124477252 +0000 UTC m=+643.667732898" Feb 17 09:16:06 crc kubenswrapper[4848]: I0217 09:16:06.142024 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" podStartSLOduration=2.626811034 podStartE2EDuration="4.141992105s" podCreationTimestamp="2026-02-17 09:16:02 +0000 UTC" firstStartedPulling="2026-02-17 09:16:04.133698123 +0000 UTC m=+641.676953769" lastFinishedPulling="2026-02-17 09:16:05.648879194 +0000 UTC m=+643.192134840" 
observedRunningTime="2026-02-17 09:16:06.135605142 +0000 UTC m=+643.678860798" watchObservedRunningTime="2026-02-17 09:16:06.141992105 +0000 UTC m=+643.685247781" Feb 17 09:16:07 crc kubenswrapper[4848]: I0217 09:16:07.131429 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" event={"ID":"413a0360-d8d4-427d-adbc-3d7914e54ea5","Type":"ContainerStarted","Data":"798188aba9c45006e612e69e2cd9fd28a9cbf74303733c84029631d8913a5cda"} Feb 17 09:16:07 crc kubenswrapper[4848]: I0217 09:16:07.148772 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5c78fc5d65-x4n7r" podStartSLOduration=2.033282316 podStartE2EDuration="4.14858081s" podCreationTimestamp="2026-02-17 09:16:03 +0000 UTC" firstStartedPulling="2026-02-17 09:16:04.441943795 +0000 UTC m=+641.985199461" lastFinishedPulling="2026-02-17 09:16:06.557242299 +0000 UTC m=+644.100497955" observedRunningTime="2026-02-17 09:16:07.14752115 +0000 UTC m=+644.690776816" watchObservedRunningTime="2026-02-17 09:16:07.14858081 +0000 UTC m=+644.691836476" Feb 17 09:16:08 crc kubenswrapper[4848]: I0217 09:16:08.144146 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58c85c668d-7txp2" event={"ID":"989f6a1e-38ab-40a8-94aa-faadc620efca","Type":"ContainerStarted","Data":"9a136cd9d6cbec8e93b9ff58ef05d010686a20e0d98da76ef972e9c4b8c662d4"} Feb 17 09:16:08 crc kubenswrapper[4848]: I0217 09:16:08.173240 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58c85c668d-7txp2" podStartSLOduration=1.810829112 podStartE2EDuration="6.173225005s" podCreationTimestamp="2026-02-17 09:16:02 +0000 UTC" firstStartedPulling="2026-02-17 09:16:03.485284213 +0000 UTC m=+641.028539859" lastFinishedPulling="2026-02-17 09:16:07.847680096 +0000 UTC m=+645.390935752" observedRunningTime="2026-02-17 09:16:08.16885412 +0000 UTC 
m=+645.712109806" watchObservedRunningTime="2026-02-17 09:16:08.173225005 +0000 UTC m=+645.716480651" Feb 17 09:16:13 crc kubenswrapper[4848]: I0217 09:16:13.327754 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-6fjkq" Feb 17 09:16:13 crc kubenswrapper[4848]: I0217 09:16:13.638601 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:13 crc kubenswrapper[4848]: I0217 09:16:13.639162 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:13 crc kubenswrapper[4848]: I0217 09:16:13.647415 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:14 crc kubenswrapper[4848]: I0217 09:16:14.193304 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6588db564f-mt5h5" Feb 17 09:16:14 crc kubenswrapper[4848]: I0217 09:16:14.258998 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9h2hf"] Feb 17 09:16:23 crc kubenswrapper[4848]: I0217 09:16:23.887575 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-866bcb46dc-pl898" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.325944 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-9h2hf" podUID="a9b13597-8879-40d4-965b-580222915295" containerName="console" containerID="cri-o://8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c" gracePeriod=15 Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.742891 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9h2hf_a9b13597-8879-40d4-965b-580222915295/console/0.log" Feb 17 09:16:39 crc 
kubenswrapper[4848]: I0217 09:16:39.743146 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.755118 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb"] Feb 17 09:16:39 crc kubenswrapper[4848]: E0217 09:16:39.755370 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9b13597-8879-40d4-965b-580222915295" containerName="console" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.755389 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9b13597-8879-40d4-965b-580222915295" containerName="console" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.755488 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9b13597-8879-40d4-965b-580222915295" containerName="console" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.756287 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.757968 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.764900 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb"] Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.818512 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-oauth-serving-cert\") pod \"a9b13597-8879-40d4-965b-580222915295\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.818546 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-serving-cert\") pod \"a9b13597-8879-40d4-965b-580222915295\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.818564 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-trusted-ca-bundle\") pod \"a9b13597-8879-40d4-965b-580222915295\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.818610 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-service-ca\") pod \"a9b13597-8879-40d4-965b-580222915295\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 
09:16:39.818631 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zzg7\" (UniqueName: \"kubernetes.io/projected/a9b13597-8879-40d4-965b-580222915295-kube-api-access-7zzg7\") pod \"a9b13597-8879-40d4-965b-580222915295\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.818644 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-console-config\") pod \"a9b13597-8879-40d4-965b-580222915295\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.818688 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-oauth-config\") pod \"a9b13597-8879-40d4-965b-580222915295\" (UID: \"a9b13597-8879-40d4-965b-580222915295\") " Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.818792 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.818816 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:39 
crc kubenswrapper[4848]: I0217 09:16:39.818848 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng5xk\" (UniqueName: \"kubernetes.io/projected/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-kube-api-access-ng5xk\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.819546 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a9b13597-8879-40d4-965b-580222915295" (UID: "a9b13597-8879-40d4-965b-580222915295"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.819923 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-console-config" (OuterVolumeSpecName: "console-config") pod "a9b13597-8879-40d4-965b-580222915295" (UID: "a9b13597-8879-40d4-965b-580222915295"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.820670 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-service-ca" (OuterVolumeSpecName: "service-ca") pod "a9b13597-8879-40d4-965b-580222915295" (UID: "a9b13597-8879-40d4-965b-580222915295"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.820732 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a9b13597-8879-40d4-965b-580222915295" (UID: "a9b13597-8879-40d4-965b-580222915295"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.825021 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a9b13597-8879-40d4-965b-580222915295" (UID: "a9b13597-8879-40d4-965b-580222915295"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.825800 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9b13597-8879-40d4-965b-580222915295-kube-api-access-7zzg7" (OuterVolumeSpecName: "kube-api-access-7zzg7") pod "a9b13597-8879-40d4-965b-580222915295" (UID: "a9b13597-8879-40d4-965b-580222915295"). InnerVolumeSpecName "kube-api-access-7zzg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.830784 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a9b13597-8879-40d4-965b-580222915295" (UID: "a9b13597-8879-40d4-965b-580222915295"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.919933 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.919983 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.920030 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng5xk\" (UniqueName: \"kubernetes.io/projected/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-kube-api-access-ng5xk\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.921241 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-bundle\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.921367 4848 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-util\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.931325 4848 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-service-ca\") on node \"crc\" DevicePath \"\"" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.931377 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zzg7\" (UniqueName: \"kubernetes.io/projected/a9b13597-8879-40d4-965b-580222915295-kube-api-access-7zzg7\") on node \"crc\" DevicePath \"\"" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.931392 4848 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-console-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.931412 4848 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.931424 4848 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.931436 4848 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a9b13597-8879-40d4-965b-580222915295-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 17 
09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.931448 4848 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9b13597-8879-40d4-965b-580222915295-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:16:39 crc kubenswrapper[4848]: I0217 09:16:39.945952 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng5xk\" (UniqueName: \"kubernetes.io/projected/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-kube-api-access-ng5xk\") pod \"a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.077526 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.308153 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb"] Feb 17 09:16:40 crc kubenswrapper[4848]: W0217 09:16:40.322037 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod24a91401_07a6_41bf_ad5b_fa2b8f60a52f.slice/crio-9cd7c0e3faadeaeaa0f642ca871e4e0f893c18023f0d12b02132f559fb1dc10f WatchSource:0}: Error finding container 9cd7c0e3faadeaeaa0f642ca871e4e0f893c18023f0d12b02132f559fb1dc10f: Status 404 returned error can't find the container with id 9cd7c0e3faadeaeaa0f642ca871e4e0f893c18023f0d12b02132f559fb1dc10f Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.374800 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" 
event={"ID":"24a91401-07a6-41bf-ad5b-fa2b8f60a52f","Type":"ContainerStarted","Data":"9cd7c0e3faadeaeaa0f642ca871e4e0f893c18023f0d12b02132f559fb1dc10f"} Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.377626 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9h2hf_a9b13597-8879-40d4-965b-580222915295/console/0.log" Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.377727 4848 generic.go:334] "Generic (PLEG): container finished" podID="a9b13597-8879-40d4-965b-580222915295" containerID="8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c" exitCode=2 Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.377829 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9h2hf" event={"ID":"a9b13597-8879-40d4-965b-580222915295","Type":"ContainerDied","Data":"8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c"} Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.377893 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9h2hf" Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.377927 4848 scope.go:117] "RemoveContainer" containerID="8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c" Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.377902 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9h2hf" event={"ID":"a9b13597-8879-40d4-965b-580222915295","Type":"ContainerDied","Data":"6c65c190555b1c38fa194ca3c16b1f1f6758c38f55327734a6bec84c9e3ca899"} Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.401563 4848 scope.go:117] "RemoveContainer" containerID="8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c" Feb 17 09:16:40 crc kubenswrapper[4848]: E0217 09:16:40.402944 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c\": container with ID starting with 8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c not found: ID does not exist" containerID="8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c" Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.402991 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c"} err="failed to get container status \"8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c\": rpc error: code = NotFound desc = could not find container \"8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c\": container with ID starting with 8683e61423d6b16620ac250b3a8d71ad7c4397cb4fc92da46d646f88262cc57c not found: ID does not exist" Feb 17 09:16:40 crc kubenswrapper[4848]: I0217 09:16:40.438819 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9h2hf"] Feb 17 09:16:40 crc 
kubenswrapper[4848]: I0217 09:16:40.442336 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-9h2hf"] Feb 17 09:16:41 crc kubenswrapper[4848]: I0217 09:16:41.391081 4848 generic.go:334] "Generic (PLEG): container finished" podID="24a91401-07a6-41bf-ad5b-fa2b8f60a52f" containerID="d159f33b9cf54bad3924900006843fea312280d86e2b38a706e73eada7e70cf4" exitCode=0 Feb 17 09:16:41 crc kubenswrapper[4848]: I0217 09:16:41.400734 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9b13597-8879-40d4-965b-580222915295" path="/var/lib/kubelet/pods/a9b13597-8879-40d4-965b-580222915295/volumes" Feb 17 09:16:41 crc kubenswrapper[4848]: I0217 09:16:41.401519 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" event={"ID":"24a91401-07a6-41bf-ad5b-fa2b8f60a52f","Type":"ContainerDied","Data":"d159f33b9cf54bad3924900006843fea312280d86e2b38a706e73eada7e70cf4"} Feb 17 09:16:43 crc kubenswrapper[4848]: I0217 09:16:43.410586 4848 generic.go:334] "Generic (PLEG): container finished" podID="24a91401-07a6-41bf-ad5b-fa2b8f60a52f" containerID="217ce2c42e47a2f0a3f4ee5def1d2bd7e5aaf9a17e539ce0cdb94362fda14246" exitCode=0 Feb 17 09:16:43 crc kubenswrapper[4848]: I0217 09:16:43.410662 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" event={"ID":"24a91401-07a6-41bf-ad5b-fa2b8f60a52f","Type":"ContainerDied","Data":"217ce2c42e47a2f0a3f4ee5def1d2bd7e5aaf9a17e539ce0cdb94362fda14246"} Feb 17 09:16:44 crc kubenswrapper[4848]: I0217 09:16:44.422490 4848 generic.go:334] "Generic (PLEG): container finished" podID="24a91401-07a6-41bf-ad5b-fa2b8f60a52f" containerID="b98037632323aad289ce99c716205a2be0265162a110da01023b6a2495f71527" exitCode=0 Feb 17 09:16:44 crc kubenswrapper[4848]: I0217 09:16:44.422626 4848 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" event={"ID":"24a91401-07a6-41bf-ad5b-fa2b8f60a52f","Type":"ContainerDied","Data":"b98037632323aad289ce99c716205a2be0265162a110da01023b6a2495f71527"} Feb 17 09:16:45 crc kubenswrapper[4848]: I0217 09:16:45.725191 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:45 crc kubenswrapper[4848]: I0217 09:16:45.861530 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ng5xk\" (UniqueName: \"kubernetes.io/projected/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-kube-api-access-ng5xk\") pod \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " Feb 17 09:16:45 crc kubenswrapper[4848]: I0217 09:16:45.861694 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-bundle\") pod \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " Feb 17 09:16:45 crc kubenswrapper[4848]: I0217 09:16:45.861751 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-util\") pod \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\" (UID: \"24a91401-07a6-41bf-ad5b-fa2b8f60a52f\") " Feb 17 09:16:45 crc kubenswrapper[4848]: I0217 09:16:45.863845 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-bundle" (OuterVolumeSpecName: "bundle") pod "24a91401-07a6-41bf-ad5b-fa2b8f60a52f" (UID: "24a91401-07a6-41bf-ad5b-fa2b8f60a52f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:16:45 crc kubenswrapper[4848]: I0217 09:16:45.871264 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-kube-api-access-ng5xk" (OuterVolumeSpecName: "kube-api-access-ng5xk") pod "24a91401-07a6-41bf-ad5b-fa2b8f60a52f" (UID: "24a91401-07a6-41bf-ad5b-fa2b8f60a52f"). InnerVolumeSpecName "kube-api-access-ng5xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:16:45 crc kubenswrapper[4848]: I0217 09:16:45.964338 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ng5xk\" (UniqueName: \"kubernetes.io/projected/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-kube-api-access-ng5xk\") on node \"crc\" DevicePath \"\"" Feb 17 09:16:45 crc kubenswrapper[4848]: I0217 09:16:45.964399 4848 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:16:46 crc kubenswrapper[4848]: I0217 09:16:46.147961 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-util" (OuterVolumeSpecName: "util") pod "24a91401-07a6-41bf-ad5b-fa2b8f60a52f" (UID: "24a91401-07a6-41bf-ad5b-fa2b8f60a52f"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:16:46 crc kubenswrapper[4848]: I0217 09:16:46.167273 4848 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/24a91401-07a6-41bf-ad5b-fa2b8f60a52f-util\") on node \"crc\" DevicePath \"\"" Feb 17 09:16:46 crc kubenswrapper[4848]: I0217 09:16:46.439415 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" event={"ID":"24a91401-07a6-41bf-ad5b-fa2b8f60a52f","Type":"ContainerDied","Data":"9cd7c0e3faadeaeaa0f642ca871e4e0f893c18023f0d12b02132f559fb1dc10f"} Feb 17 09:16:46 crc kubenswrapper[4848]: I0217 09:16:46.439474 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9cd7c0e3faadeaeaa0f642ca871e4e0f893c18023f0d12b02132f559fb1dc10f" Feb 17 09:16:46 crc kubenswrapper[4848]: I0217 09:16:46.439503 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.038216 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6df4786bd-895gn"] Feb 17 09:16:55 crc kubenswrapper[4848]: E0217 09:16:55.038861 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a91401-07a6-41bf-ad5b-fa2b8f60a52f" containerName="extract" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.038873 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a91401-07a6-41bf-ad5b-fa2b8f60a52f" containerName="extract" Feb 17 09:16:55 crc kubenswrapper[4848]: E0217 09:16:55.038883 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a91401-07a6-41bf-ad5b-fa2b8f60a52f" containerName="pull" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.038888 4848 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="24a91401-07a6-41bf-ad5b-fa2b8f60a52f" containerName="pull" Feb 17 09:16:55 crc kubenswrapper[4848]: E0217 09:16:55.038900 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a91401-07a6-41bf-ad5b-fa2b8f60a52f" containerName="util" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.038907 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a91401-07a6-41bf-ad5b-fa2b8f60a52f" containerName="util" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.038995 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a91401-07a6-41bf-ad5b-fa2b8f60a52f" containerName="extract" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.039345 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.042273 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-fwddn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.042846 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.043078 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.043274 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.043545 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.106813 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6df4786bd-895gn"] Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 
09:16:55.193625 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/054a38ba-b80d-44df-b84a-e5e3b9847df3-webhook-cert\") pod \"metallb-operator-controller-manager-6df4786bd-895gn\" (UID: \"054a38ba-b80d-44df-b84a-e5e3b9847df3\") " pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.193744 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/054a38ba-b80d-44df-b84a-e5e3b9847df3-apiservice-cert\") pod \"metallb-operator-controller-manager-6df4786bd-895gn\" (UID: \"054a38ba-b80d-44df-b84a-e5e3b9847df3\") " pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.193807 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzmfj\" (UniqueName: \"kubernetes.io/projected/054a38ba-b80d-44df-b84a-e5e3b9847df3-kube-api-access-lzmfj\") pod \"metallb-operator-controller-manager-6df4786bd-895gn\" (UID: \"054a38ba-b80d-44df-b84a-e5e3b9847df3\") " pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.260715 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp"] Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.261462 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.263380 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.263693 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.264107 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-4bhh9" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.285525 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp"] Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.299317 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/054a38ba-b80d-44df-b84a-e5e3b9847df3-apiservice-cert\") pod \"metallb-operator-controller-manager-6df4786bd-895gn\" (UID: \"054a38ba-b80d-44df-b84a-e5e3b9847df3\") " pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.299368 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzmfj\" (UniqueName: \"kubernetes.io/projected/054a38ba-b80d-44df-b84a-e5e3b9847df3-kube-api-access-lzmfj\") pod \"metallb-operator-controller-manager-6df4786bd-895gn\" (UID: \"054a38ba-b80d-44df-b84a-e5e3b9847df3\") " pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.299398 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/054a38ba-b80d-44df-b84a-e5e3b9847df3-webhook-cert\") pod 
\"metallb-operator-controller-manager-6df4786bd-895gn\" (UID: \"054a38ba-b80d-44df-b84a-e5e3b9847df3\") " pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.306486 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/054a38ba-b80d-44df-b84a-e5e3b9847df3-webhook-cert\") pod \"metallb-operator-controller-manager-6df4786bd-895gn\" (UID: \"054a38ba-b80d-44df-b84a-e5e3b9847df3\") " pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.306512 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/054a38ba-b80d-44df-b84a-e5e3b9847df3-apiservice-cert\") pod \"metallb-operator-controller-manager-6df4786bd-895gn\" (UID: \"054a38ba-b80d-44df-b84a-e5e3b9847df3\") " pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.316648 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzmfj\" (UniqueName: \"kubernetes.io/projected/054a38ba-b80d-44df-b84a-e5e3b9847df3-kube-api-access-lzmfj\") pod \"metallb-operator-controller-manager-6df4786bd-895gn\" (UID: \"054a38ba-b80d-44df-b84a-e5e3b9847df3\") " pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.352663 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.400176 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccc28183-efb5-4673-8268-44ed1ced4cb7-apiservice-cert\") pod \"metallb-operator-webhook-server-7f987958c8-pm7pp\" (UID: \"ccc28183-efb5-4673-8268-44ed1ced4cb7\") " pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.400228 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w9j2\" (UniqueName: \"kubernetes.io/projected/ccc28183-efb5-4673-8268-44ed1ced4cb7-kube-api-access-6w9j2\") pod \"metallb-operator-webhook-server-7f987958c8-pm7pp\" (UID: \"ccc28183-efb5-4673-8268-44ed1ced4cb7\") " pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.400384 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccc28183-efb5-4673-8268-44ed1ced4cb7-webhook-cert\") pod \"metallb-operator-webhook-server-7f987958c8-pm7pp\" (UID: \"ccc28183-efb5-4673-8268-44ed1ced4cb7\") " pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.501726 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccc28183-efb5-4673-8268-44ed1ced4cb7-webhook-cert\") pod \"metallb-operator-webhook-server-7f987958c8-pm7pp\" (UID: \"ccc28183-efb5-4673-8268-44ed1ced4cb7\") " pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.502034 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccc28183-efb5-4673-8268-44ed1ced4cb7-apiservice-cert\") pod \"metallb-operator-webhook-server-7f987958c8-pm7pp\" (UID: \"ccc28183-efb5-4673-8268-44ed1ced4cb7\") " pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.502067 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w9j2\" (UniqueName: \"kubernetes.io/projected/ccc28183-efb5-4673-8268-44ed1ced4cb7-kube-api-access-6w9j2\") pod \"metallb-operator-webhook-server-7f987958c8-pm7pp\" (UID: \"ccc28183-efb5-4673-8268-44ed1ced4cb7\") " pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.512530 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ccc28183-efb5-4673-8268-44ed1ced4cb7-webhook-cert\") pod \"metallb-operator-webhook-server-7f987958c8-pm7pp\" (UID: \"ccc28183-efb5-4673-8268-44ed1ced4cb7\") " pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.513367 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ccc28183-efb5-4673-8268-44ed1ced4cb7-apiservice-cert\") pod \"metallb-operator-webhook-server-7f987958c8-pm7pp\" (UID: \"ccc28183-efb5-4673-8268-44ed1ced4cb7\") " pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.518671 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w9j2\" (UniqueName: \"kubernetes.io/projected/ccc28183-efb5-4673-8268-44ed1ced4cb7-kube-api-access-6w9j2\") pod \"metallb-operator-webhook-server-7f987958c8-pm7pp\" (UID: \"ccc28183-efb5-4673-8268-44ed1ced4cb7\") 
" pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.573243 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:16:55 crc kubenswrapper[4848]: I0217 09:16:55.627368 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6df4786bd-895gn"] Feb 17 09:16:56 crc kubenswrapper[4848]: I0217 09:16:56.016457 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp"] Feb 17 09:16:56 crc kubenswrapper[4848]: W0217 09:16:56.024358 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccc28183_efb5_4673_8268_44ed1ced4cb7.slice/crio-e1100dd9ee68d885346c03493f56c8705a61015acc28916cb0052f81ce545b04 WatchSource:0}: Error finding container e1100dd9ee68d885346c03493f56c8705a61015acc28916cb0052f81ce545b04: Status 404 returned error can't find the container with id e1100dd9ee68d885346c03493f56c8705a61015acc28916cb0052f81ce545b04 Feb 17 09:16:56 crc kubenswrapper[4848]: I0217 09:16:56.495401 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" event={"ID":"054a38ba-b80d-44df-b84a-e5e3b9847df3","Type":"ContainerStarted","Data":"4b56c09429d42ebeb9e0db247c38cbd31f49f5b83a6f7b5d82feb9c03be7988c"} Feb 17 09:16:56 crc kubenswrapper[4848]: I0217 09:16:56.496432 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" event={"ID":"ccc28183-efb5-4673-8268-44ed1ced4cb7","Type":"ContainerStarted","Data":"e1100dd9ee68d885346c03493f56c8705a61015acc28916cb0052f81ce545b04"} Feb 17 09:17:00 crc kubenswrapper[4848]: I0217 09:17:00.518479 4848 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" event={"ID":"054a38ba-b80d-44df-b84a-e5e3b9847df3","Type":"ContainerStarted","Data":"2c788d941007f19bb4565b1f1d243cbfdb2275c1f378969e36fda1f77a7f05d6"} Feb 17 09:17:00 crc kubenswrapper[4848]: I0217 09:17:00.518948 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:17:00 crc kubenswrapper[4848]: I0217 09:17:00.522212 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" event={"ID":"ccc28183-efb5-4673-8268-44ed1ced4cb7","Type":"ContainerStarted","Data":"8dc00f58d9de44d491ee15576380448697f5bb200691f1fac8c3cebcb4ee0d9e"} Feb 17 09:17:00 crc kubenswrapper[4848]: I0217 09:17:00.522389 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:17:00 crc kubenswrapper[4848]: I0217 09:17:00.540806 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" podStartSLOduration=1.287713862 podStartE2EDuration="5.540788428s" podCreationTimestamp="2026-02-17 09:16:55 +0000 UTC" firstStartedPulling="2026-02-17 09:16:55.644100666 +0000 UTC m=+693.187356312" lastFinishedPulling="2026-02-17 09:16:59.897175222 +0000 UTC m=+697.440430878" observedRunningTime="2026-02-17 09:17:00.537841396 +0000 UTC m=+698.081097052" watchObservedRunningTime="2026-02-17 09:17:00.540788428 +0000 UTC m=+698.084044084" Feb 17 09:17:00 crc kubenswrapper[4848]: I0217 09:17:00.568689 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" podStartSLOduration=1.680802344 podStartE2EDuration="5.568673602s" podCreationTimestamp="2026-02-17 09:16:55 +0000 UTC" firstStartedPulling="2026-02-17 
09:16:56.029263878 +0000 UTC m=+693.572519524" lastFinishedPulling="2026-02-17 09:16:59.917135136 +0000 UTC m=+697.460390782" observedRunningTime="2026-02-17 09:17:00.56497214 +0000 UTC m=+698.108227796" watchObservedRunningTime="2026-02-17 09:17:00.568673602 +0000 UTC m=+698.111929258" Feb 17 09:17:15 crc kubenswrapper[4848]: I0217 09:17:15.653195 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7f987958c8-pm7pp" Feb 17 09:17:35 crc kubenswrapper[4848]: I0217 09:17:35.354859 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.080129 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-hrx8s"] Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.082716 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.088741 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm"] Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.089446 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.101625 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-t5pfw" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.101831 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.101943 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.102396 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.120165 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm"] Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.160372 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zsdl\" (UniqueName: \"kubernetes.io/projected/c229235f-b879-43bc-9b19-b4196264d1ec-kube-api-access-4zsdl\") pod \"frr-k8s-webhook-server-78b44bf5bb-5l7zm\" (UID: \"c229235f-b879-43bc-9b19-b4196264d1ec\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.160421 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe3f8c76-b77b-410c-830a-24fb19a0de6a-metrics-certs\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.160445 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s5pc\" (UniqueName: 
\"kubernetes.io/projected/fe3f8c76-b77b-410c-830a-24fb19a0de6a-kube-api-access-2s5pc\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.160464 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c229235f-b879-43bc-9b19-b4196264d1ec-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5l7zm\" (UID: \"c229235f-b879-43bc-9b19-b4196264d1ec\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.160489 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fe3f8c76-b77b-410c-830a-24fb19a0de6a-frr-startup\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.160505 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-frr-sockets\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.160533 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-metrics\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.160545 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-frr-conf\") pod 
\"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.160577 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-reloader\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.195326 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-lmz9q"] Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.196151 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.197789 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.197826 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.198063 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-dc5hr" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.198096 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.218394 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-69bbfbf88f-7cbd2"] Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.219315 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.220575 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.229966 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-7cbd2"] Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261347 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-memberlist\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261406 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-reloader\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261438 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zsdl\" (UniqueName: \"kubernetes.io/projected/c229235f-b879-43bc-9b19-b4196264d1ec-kube-api-access-4zsdl\") pod \"frr-k8s-webhook-server-78b44bf5bb-5l7zm\" (UID: \"c229235f-b879-43bc-9b19-b4196264d1ec\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261454 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-metrics-certs\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 
09:17:36.261476 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe3f8c76-b77b-410c-830a-24fb19a0de6a-metrics-certs\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261498 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2s5pc\" (UniqueName: \"kubernetes.io/projected/fe3f8c76-b77b-410c-830a-24fb19a0de6a-kube-api-access-2s5pc\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261515 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c229235f-b879-43bc-9b19-b4196264d1ec-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5l7zm\" (UID: \"c229235f-b879-43bc-9b19-b4196264d1ec\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261530 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdg2\" (UniqueName: \"kubernetes.io/projected/36953889-0f59-4c5e-a666-c80389e18bf8-kube-api-access-2zdg2\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261554 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fe3f8c76-b77b-410c-830a-24fb19a0de6a-frr-startup\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261568 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" 
(UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-frr-sockets\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261594 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-metrics\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261608 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-frr-conf\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.261625 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/36953889-0f59-4c5e-a666-c80389e18bf8-metallb-excludel2\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.262040 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-reloader\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: E0217 09:17:36.262111 4848 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.262218 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-metrics\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: E0217 09:17:36.262115 4848 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Feb 17 09:17:36 crc kubenswrapper[4848]: E0217 09:17:36.262226 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c229235f-b879-43bc-9b19-b4196264d1ec-cert podName:c229235f-b879-43bc-9b19-b4196264d1ec nodeName:}" failed. No retries permitted until 2026-02-17 09:17:36.762204558 +0000 UTC m=+734.305460204 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c229235f-b879-43bc-9b19-b4196264d1ec-cert") pod "frr-k8s-webhook-server-78b44bf5bb-5l7zm" (UID: "c229235f-b879-43bc-9b19-b4196264d1ec") : secret "frr-k8s-webhook-server-cert" not found Feb 17 09:17:36 crc kubenswrapper[4848]: E0217 09:17:36.262384 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fe3f8c76-b77b-410c-830a-24fb19a0de6a-metrics-certs podName:fe3f8c76-b77b-410c-830a-24fb19a0de6a nodeName:}" failed. No retries permitted until 2026-02-17 09:17:36.762374663 +0000 UTC m=+734.305630309 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fe3f8c76-b77b-410c-830a-24fb19a0de6a-metrics-certs") pod "frr-k8s-hrx8s" (UID: "fe3f8c76-b77b-410c-830a-24fb19a0de6a") : secret "frr-k8s-certs-secret" not found Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.262406 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-frr-conf\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.262799 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/fe3f8c76-b77b-410c-830a-24fb19a0de6a-frr-sockets\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.263069 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/fe3f8c76-b77b-410c-830a-24fb19a0de6a-frr-startup\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.280482 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s5pc\" (UniqueName: \"kubernetes.io/projected/fe3f8c76-b77b-410c-830a-24fb19a0de6a-kube-api-access-2s5pc\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.286642 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zsdl\" (UniqueName: \"kubernetes.io/projected/c229235f-b879-43bc-9b19-b4196264d1ec-kube-api-access-4zsdl\") pod \"frr-k8s-webhook-server-78b44bf5bb-5l7zm\" (UID: 
\"c229235f-b879-43bc-9b19-b4196264d1ec\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.362854 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-metrics-certs\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.362928 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdg2\" (UniqueName: \"kubernetes.io/projected/36953889-0f59-4c5e-a666-c80389e18bf8-kube-api-access-2zdg2\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.362957 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff272d7-4c99-464c-819d-b7b22fc8be06-metrics-certs\") pod \"controller-69bbfbf88f-7cbd2\" (UID: \"8ff272d7-4c99-464c-819d-b7b22fc8be06\") " pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.362991 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/36953889-0f59-4c5e-a666-c80389e18bf8-metallb-excludel2\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.363023 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x264p\" (UniqueName: \"kubernetes.io/projected/8ff272d7-4c99-464c-819d-b7b22fc8be06-kube-api-access-x264p\") pod \"controller-69bbfbf88f-7cbd2\" (UID: \"8ff272d7-4c99-464c-819d-b7b22fc8be06\") " 
pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.363039 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-memberlist\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.363058 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff272d7-4c99-464c-819d-b7b22fc8be06-cert\") pod \"controller-69bbfbf88f-7cbd2\" (UID: \"8ff272d7-4c99-464c-819d-b7b22fc8be06\") " pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: E0217 09:17:36.363170 4848 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Feb 17 09:17:36 crc kubenswrapper[4848]: E0217 09:17:36.363210 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-metrics-certs podName:36953889-0f59-4c5e-a666-c80389e18bf8 nodeName:}" failed. No retries permitted until 2026-02-17 09:17:36.863194823 +0000 UTC m=+734.406450469 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-metrics-certs") pod "speaker-lmz9q" (UID: "36953889-0f59-4c5e-a666-c80389e18bf8") : secret "speaker-certs-secret" not found Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.364089 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/36953889-0f59-4c5e-a666-c80389e18bf8-metallb-excludel2\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: E0217 09:17:36.364155 4848 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 09:17:36 crc kubenswrapper[4848]: E0217 09:17:36.364178 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-memberlist podName:36953889-0f59-4c5e-a666-c80389e18bf8 nodeName:}" failed. No retries permitted until 2026-02-17 09:17:36.864170671 +0000 UTC m=+734.407426317 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-memberlist") pod "speaker-lmz9q" (UID: "36953889-0f59-4c5e-a666-c80389e18bf8") : secret "metallb-memberlist" not found Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.380182 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdg2\" (UniqueName: \"kubernetes.io/projected/36953889-0f59-4c5e-a666-c80389e18bf8-kube-api-access-2zdg2\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.463975 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x264p\" (UniqueName: \"kubernetes.io/projected/8ff272d7-4c99-464c-819d-b7b22fc8be06-kube-api-access-x264p\") pod \"controller-69bbfbf88f-7cbd2\" (UID: \"8ff272d7-4c99-464c-819d-b7b22fc8be06\") " pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.464034 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff272d7-4c99-464c-819d-b7b22fc8be06-cert\") pod \"controller-69bbfbf88f-7cbd2\" (UID: \"8ff272d7-4c99-464c-819d-b7b22fc8be06\") " pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.464121 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff272d7-4c99-464c-819d-b7b22fc8be06-metrics-certs\") pod \"controller-69bbfbf88f-7cbd2\" (UID: \"8ff272d7-4c99-464c-819d-b7b22fc8be06\") " pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.466802 4848 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 
09:17:36.467692 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8ff272d7-4c99-464c-819d-b7b22fc8be06-metrics-certs\") pod \"controller-69bbfbf88f-7cbd2\" (UID: \"8ff272d7-4c99-464c-819d-b7b22fc8be06\") " pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.477702 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ff272d7-4c99-464c-819d-b7b22fc8be06-cert\") pod \"controller-69bbfbf88f-7cbd2\" (UID: \"8ff272d7-4c99-464c-819d-b7b22fc8be06\") " pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.483620 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x264p\" (UniqueName: \"kubernetes.io/projected/8ff272d7-4c99-464c-819d-b7b22fc8be06-kube-api-access-x264p\") pod \"controller-69bbfbf88f-7cbd2\" (UID: \"8ff272d7-4c99-464c-819d-b7b22fc8be06\") " pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.535653 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.748736 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-69bbfbf88f-7cbd2"] Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.767565 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe3f8c76-b77b-410c-830a-24fb19a0de6a-metrics-certs\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.767616 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c229235f-b879-43bc-9b19-b4196264d1ec-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5l7zm\" (UID: \"c229235f-b879-43bc-9b19-b4196264d1ec\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.772580 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fe3f8c76-b77b-410c-830a-24fb19a0de6a-metrics-certs\") pod \"frr-k8s-hrx8s\" (UID: \"fe3f8c76-b77b-410c-830a-24fb19a0de6a\") " pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.773577 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c229235f-b879-43bc-9b19-b4196264d1ec-cert\") pod \"frr-k8s-webhook-server-78b44bf5bb-5l7zm\" (UID: \"c229235f-b879-43bc-9b19-b4196264d1ec\") " pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.793055 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7cbd2" 
event={"ID":"8ff272d7-4c99-464c-819d-b7b22fc8be06","Type":"ContainerStarted","Data":"7e9c58a367793ee9323b8c532f040c58448abc240328324a3611f2087eeaccbc"} Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.869213 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-memberlist\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: E0217 09:17:36.869442 4848 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 17 09:17:36 crc kubenswrapper[4848]: E0217 09:17:36.869538 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-memberlist podName:36953889-0f59-4c5e-a666-c80389e18bf8 nodeName:}" failed. No retries permitted until 2026-02-17 09:17:37.869506234 +0000 UTC m=+735.412761920 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-memberlist") pod "speaker-lmz9q" (UID: "36953889-0f59-4c5e-a666-c80389e18bf8") : secret "metallb-memberlist" not found Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.869753 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-metrics-certs\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:36 crc kubenswrapper[4848]: I0217 09:17:36.874635 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-metrics-certs\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.021096 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.036444 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.256682 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm"] Feb 17 09:17:37 crc kubenswrapper[4848]: W0217 09:17:37.259350 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc229235f_b879_43bc_9b19_b4196264d1ec.slice/crio-83d7b1150a4ac26b8afc54b15eaa9e5ec7e55ddac7be6d8493fa77c222e43cc8 WatchSource:0}: Error finding container 83d7b1150a4ac26b8afc54b15eaa9e5ec7e55ddac7be6d8493fa77c222e43cc8: Status 404 returned error can't find the container with id 83d7b1150a4ac26b8afc54b15eaa9e5ec7e55ddac7be6d8493fa77c222e43cc8 Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.806190 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrx8s" event={"ID":"fe3f8c76-b77b-410c-830a-24fb19a0de6a","Type":"ContainerStarted","Data":"a3e5a3aaa5c6560ef54f86e79e08d24ce5df57f9b2bf1defd0deb974be9b6297"} Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.808125 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" event={"ID":"c229235f-b879-43bc-9b19-b4196264d1ec","Type":"ContainerStarted","Data":"83d7b1150a4ac26b8afc54b15eaa9e5ec7e55ddac7be6d8493fa77c222e43cc8"} Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.811789 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7cbd2" event={"ID":"8ff272d7-4c99-464c-819d-b7b22fc8be06","Type":"ContainerStarted","Data":"107c53d11a3b8311c34843209b4d67f0f4c17d7402dd5e166e90480d4b07f5dd"} Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.811819 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-69bbfbf88f-7cbd2" 
event={"ID":"8ff272d7-4c99-464c-819d-b7b22fc8be06","Type":"ContainerStarted","Data":"992079780e43f4049d6e6116abffb543595624e9806ea4647b645032c2828da0"} Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.811976 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.840007 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-69bbfbf88f-7cbd2" podStartSLOduration=1.8399864940000001 podStartE2EDuration="1.839986494s" podCreationTimestamp="2026-02-17 09:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:17:37.835064362 +0000 UTC m=+735.378320008" watchObservedRunningTime="2026-02-17 09:17:37.839986494 +0000 UTC m=+735.383242140" Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.889481 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-memberlist\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:37 crc kubenswrapper[4848]: I0217 09:17:37.895090 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/36953889-0f59-4c5e-a666-c80389e18bf8-memberlist\") pod \"speaker-lmz9q\" (UID: \"36953889-0f59-4c5e-a666-c80389e18bf8\") " pod="metallb-system/speaker-lmz9q" Feb 17 09:17:38 crc kubenswrapper[4848]: I0217 09:17:38.009821 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lmz9q" Feb 17 09:17:38 crc kubenswrapper[4848]: W0217 09:17:38.033782 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36953889_0f59_4c5e_a666_c80389e18bf8.slice/crio-6e07c2991e15caaf56a1ab0cfcb6f103cf292551c957deb606a18b65cb495482 WatchSource:0}: Error finding container 6e07c2991e15caaf56a1ab0cfcb6f103cf292551c957deb606a18b65cb495482: Status 404 returned error can't find the container with id 6e07c2991e15caaf56a1ab0cfcb6f103cf292551c957deb606a18b65cb495482 Feb 17 09:17:38 crc kubenswrapper[4848]: I0217 09:17:38.820794 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lmz9q" event={"ID":"36953889-0f59-4c5e-a666-c80389e18bf8","Type":"ContainerStarted","Data":"5d72ccd00458a93a7efc495dc3854b6a75649c71ff3feca549d6332bb8c471cb"} Feb 17 09:17:38 crc kubenswrapper[4848]: I0217 09:17:38.821100 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lmz9q" event={"ID":"36953889-0f59-4c5e-a666-c80389e18bf8","Type":"ContainerStarted","Data":"359c96e2ce89b55df7ce7f4dec48e9d837009988f52127b3d420dddbb99fc7b6"} Feb 17 09:17:38 crc kubenswrapper[4848]: I0217 09:17:38.821114 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lmz9q" event={"ID":"36953889-0f59-4c5e-a666-c80389e18bf8","Type":"ContainerStarted","Data":"6e07c2991e15caaf56a1ab0cfcb6f103cf292551c957deb606a18b65cb495482"} Feb 17 09:17:38 crc kubenswrapper[4848]: I0217 09:17:38.821281 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-lmz9q" Feb 17 09:17:38 crc kubenswrapper[4848]: I0217 09:17:38.857541 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-lmz9q" podStartSLOduration=2.857520967 podStartE2EDuration="2.857520967s" podCreationTimestamp="2026-02-17 09:17:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:17:38.838711906 +0000 UTC m=+736.381967562" watchObservedRunningTime="2026-02-17 09:17:38.857520967 +0000 UTC m=+736.400776613" Feb 17 09:17:44 crc kubenswrapper[4848]: I0217 09:17:44.869396 4848 generic.go:334] "Generic (PLEG): container finished" podID="fe3f8c76-b77b-410c-830a-24fb19a0de6a" containerID="1ed31b2ba614d42146fc18822425a11b482f15e8ffe110d2c552c2ec6d97afb4" exitCode=0 Feb 17 09:17:44 crc kubenswrapper[4848]: I0217 09:17:44.869534 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrx8s" event={"ID":"fe3f8c76-b77b-410c-830a-24fb19a0de6a","Type":"ContainerDied","Data":"1ed31b2ba614d42146fc18822425a11b482f15e8ffe110d2c552c2ec6d97afb4"} Feb 17 09:17:44 crc kubenswrapper[4848]: I0217 09:17:44.873992 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" event={"ID":"c229235f-b879-43bc-9b19-b4196264d1ec","Type":"ContainerStarted","Data":"5ab5338c1d6b5a61c7ef19562a1d7f40c0e1e135ae6118d56fb7ba25767c9f3b"} Feb 17 09:17:44 crc kubenswrapper[4848]: I0217 09:17:44.874450 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:44 crc kubenswrapper[4848]: I0217 09:17:44.935252 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" podStartSLOduration=2.372164099 podStartE2EDuration="8.935228323s" podCreationTimestamp="2026-02-17 09:17:36 +0000 UTC" firstStartedPulling="2026-02-17 09:17:37.261170708 +0000 UTC m=+734.804426354" lastFinishedPulling="2026-02-17 09:17:43.824234922 +0000 UTC m=+741.367490578" observedRunningTime="2026-02-17 09:17:44.932463284 +0000 UTC m=+742.475718940" watchObservedRunningTime="2026-02-17 09:17:44.935228323 +0000 UTC m=+742.478483989" Feb 17 
09:17:45 crc kubenswrapper[4848]: I0217 09:17:45.885247 4848 generic.go:334] "Generic (PLEG): container finished" podID="fe3f8c76-b77b-410c-830a-24fb19a0de6a" containerID="eb21c78c9c6536c55343f7cdca740e1df8acab111b571f304e0bedcb287b13c4" exitCode=0 Feb 17 09:17:45 crc kubenswrapper[4848]: I0217 09:17:45.885319 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrx8s" event={"ID":"fe3f8c76-b77b-410c-830a-24fb19a0de6a","Type":"ContainerDied","Data":"eb21c78c9c6536c55343f7cdca740e1df8acab111b571f304e0bedcb287b13c4"} Feb 17 09:17:46 crc kubenswrapper[4848]: I0217 09:17:46.539693 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-69bbfbf88f-7cbd2" Feb 17 09:17:46 crc kubenswrapper[4848]: I0217 09:17:46.893621 4848 generic.go:334] "Generic (PLEG): container finished" podID="fe3f8c76-b77b-410c-830a-24fb19a0de6a" containerID="b538e5accc3aeac09248547f1a77afb9153afeb58930f196017a7d094b4c567e" exitCode=0 Feb 17 09:17:46 crc kubenswrapper[4848]: I0217 09:17:46.893709 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrx8s" event={"ID":"fe3f8c76-b77b-410c-830a-24fb19a0de6a","Type":"ContainerDied","Data":"b538e5accc3aeac09248547f1a77afb9153afeb58930f196017a7d094b4c567e"} Feb 17 09:17:47 crc kubenswrapper[4848]: I0217 09:17:47.906615 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrx8s" event={"ID":"fe3f8c76-b77b-410c-830a-24fb19a0de6a","Type":"ContainerStarted","Data":"189202f7e7afc3ca80d51da8c324708a0fa424d7cecd81f6a1760a72c60453d2"} Feb 17 09:17:47 crc kubenswrapper[4848]: I0217 09:17:47.907000 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrx8s" event={"ID":"fe3f8c76-b77b-410c-830a-24fb19a0de6a","Type":"ContainerStarted","Data":"d10364221557eae15a33031cb8956e75c32977bc3e56b41b830af074b951c082"} Feb 17 09:17:47 crc kubenswrapper[4848]: I0217 09:17:47.907023 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="metallb-system/frr-k8s-hrx8s" event={"ID":"fe3f8c76-b77b-410c-830a-24fb19a0de6a","Type":"ContainerStarted","Data":"dad705c77b0497a1c073fb617a234aebd21035f02e7abc9875ee99364316e54f"} Feb 17 09:17:47 crc kubenswrapper[4848]: I0217 09:17:47.907042 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrx8s" event={"ID":"fe3f8c76-b77b-410c-830a-24fb19a0de6a","Type":"ContainerStarted","Data":"ce3ac575436131c784fe0664c3513c990c0734f5f21dff3a5b4eac15a0f5ca84"} Feb 17 09:17:47 crc kubenswrapper[4848]: I0217 09:17:47.907062 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrx8s" event={"ID":"fe3f8c76-b77b-410c-830a-24fb19a0de6a","Type":"ContainerStarted","Data":"30a57c83bfcf1f3e9838dd184b23685266c2b2cb56f9991dfb42ca5af0a4d3f6"} Feb 17 09:17:48 crc kubenswrapper[4848]: I0217 09:17:48.013441 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-lmz9q" Feb 17 09:17:48 crc kubenswrapper[4848]: I0217 09:17:48.920374 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-hrx8s" event={"ID":"fe3f8c76-b77b-410c-830a-24fb19a0de6a","Type":"ContainerStarted","Data":"6d4972400cdcd014f3c6be0d9957d452931e2129de682f8dc60cfe1c4139f0e7"} Feb 17 09:17:48 crc kubenswrapper[4848]: I0217 09:17:48.920693 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:48 crc kubenswrapper[4848]: I0217 09:17:48.950308 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-hrx8s" podStartSLOduration=6.305484445 podStartE2EDuration="12.950286261s" podCreationTimestamp="2026-02-17 09:17:36 +0000 UTC" firstStartedPulling="2026-02-17 09:17:37.179239711 +0000 UTC m=+734.722495357" lastFinishedPulling="2026-02-17 09:17:43.824041527 +0000 UTC m=+741.367297173" observedRunningTime="2026-02-17 09:17:48.947693126 +0000 UTC m=+746.490948802" 
watchObservedRunningTime="2026-02-17 09:17:48.950286261 +0000 UTC m=+746.493541947" Feb 17 09:17:50 crc kubenswrapper[4848]: I0217 09:17:50.805417 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-r5tp9"] Feb 17 09:17:50 crc kubenswrapper[4848]: I0217 09:17:50.806694 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r5tp9" Feb 17 09:17:50 crc kubenswrapper[4848]: I0217 09:17:50.808473 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-dbvd7" Feb 17 09:17:50 crc kubenswrapper[4848]: I0217 09:17:50.811724 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 17 09:17:50 crc kubenswrapper[4848]: I0217 09:17:50.811724 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 17 09:17:50 crc kubenswrapper[4848]: I0217 09:17:50.825453 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r5tp9"] Feb 17 09:17:50 crc kubenswrapper[4848]: I0217 09:17:50.884561 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chtjf\" (UniqueName: \"kubernetes.io/projected/549f9474-efed-4da0-9ea1-b910917ba420-kube-api-access-chtjf\") pod \"openstack-operator-index-r5tp9\" (UID: \"549f9474-efed-4da0-9ea1-b910917ba420\") " pod="openstack-operators/openstack-operator-index-r5tp9" Feb 17 09:17:50 crc kubenswrapper[4848]: I0217 09:17:50.986155 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chtjf\" (UniqueName: \"kubernetes.io/projected/549f9474-efed-4da0-9ea1-b910917ba420-kube-api-access-chtjf\") pod \"openstack-operator-index-r5tp9\" (UID: \"549f9474-efed-4da0-9ea1-b910917ba420\") " 
pod="openstack-operators/openstack-operator-index-r5tp9" Feb 17 09:17:51 crc kubenswrapper[4848]: I0217 09:17:51.006861 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chtjf\" (UniqueName: \"kubernetes.io/projected/549f9474-efed-4da0-9ea1-b910917ba420-kube-api-access-chtjf\") pod \"openstack-operator-index-r5tp9\" (UID: \"549f9474-efed-4da0-9ea1-b910917ba420\") " pod="openstack-operators/openstack-operator-index-r5tp9" Feb 17 09:17:51 crc kubenswrapper[4848]: I0217 09:17:51.133140 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r5tp9" Feb 17 09:17:51 crc kubenswrapper[4848]: I0217 09:17:51.570591 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-r5tp9"] Feb 17 09:17:51 crc kubenswrapper[4848]: W0217 09:17:51.582216 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549f9474_efed_4da0_9ea1_b910917ba420.slice/crio-033acc5aae0657dc2d1167db75d77927b136f63d03e85690076d1262352b7d26 WatchSource:0}: Error finding container 033acc5aae0657dc2d1167db75d77927b136f63d03e85690076d1262352b7d26: Status 404 returned error can't find the container with id 033acc5aae0657dc2d1167db75d77927b136f63d03e85690076d1262352b7d26 Feb 17 09:17:51 crc kubenswrapper[4848]: I0217 09:17:51.942097 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r5tp9" event={"ID":"549f9474-efed-4da0-9ea1-b910917ba420","Type":"ContainerStarted","Data":"033acc5aae0657dc2d1167db75d77927b136f63d03e85690076d1262352b7d26"} Feb 17 09:17:52 crc kubenswrapper[4848]: I0217 09:17:52.022383 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:52 crc kubenswrapper[4848]: I0217 09:17:52.055797 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:52 crc kubenswrapper[4848]: I0217 09:17:52.952270 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r5tp9" event={"ID":"549f9474-efed-4da0-9ea1-b910917ba420","Type":"ContainerStarted","Data":"600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41"} Feb 17 09:17:52 crc kubenswrapper[4848]: I0217 09:17:52.987815 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-r5tp9" podStartSLOduration=2.033201402 podStartE2EDuration="2.987750903s" podCreationTimestamp="2026-02-17 09:17:50 +0000 UTC" firstStartedPulling="2026-02-17 09:17:51.584557059 +0000 UTC m=+749.127812705" lastFinishedPulling="2026-02-17 09:17:52.53910656 +0000 UTC m=+750.082362206" observedRunningTime="2026-02-17 09:17:52.987481475 +0000 UTC m=+750.530737161" watchObservedRunningTime="2026-02-17 09:17:52.987750903 +0000 UTC m=+750.531006589" Feb 17 09:17:53 crc kubenswrapper[4848]: I0217 09:17:53.992253 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-r5tp9"] Feb 17 09:17:54 crc kubenswrapper[4848]: I0217 09:17:54.599362 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-b2s6b"] Feb 17 09:17:54 crc kubenswrapper[4848]: I0217 09:17:54.600818 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-b2s6b" Feb 17 09:17:54 crc kubenswrapper[4848]: I0217 09:17:54.627154 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b2s6b"] Feb 17 09:17:54 crc kubenswrapper[4848]: I0217 09:17:54.735429 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frv9d\" (UniqueName: \"kubernetes.io/projected/7b87255a-321f-4b26-bc23-a7d5aeff53e2-kube-api-access-frv9d\") pod \"openstack-operator-index-b2s6b\" (UID: \"7b87255a-321f-4b26-bc23-a7d5aeff53e2\") " pod="openstack-operators/openstack-operator-index-b2s6b" Feb 17 09:17:54 crc kubenswrapper[4848]: I0217 09:17:54.836377 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frv9d\" (UniqueName: \"kubernetes.io/projected/7b87255a-321f-4b26-bc23-a7d5aeff53e2-kube-api-access-frv9d\") pod \"openstack-operator-index-b2s6b\" (UID: \"7b87255a-321f-4b26-bc23-a7d5aeff53e2\") " pod="openstack-operators/openstack-operator-index-b2s6b" Feb 17 09:17:54 crc kubenswrapper[4848]: I0217 09:17:54.869383 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frv9d\" (UniqueName: \"kubernetes.io/projected/7b87255a-321f-4b26-bc23-a7d5aeff53e2-kube-api-access-frv9d\") pod \"openstack-operator-index-b2s6b\" (UID: \"7b87255a-321f-4b26-bc23-a7d5aeff53e2\") " pod="openstack-operators/openstack-operator-index-b2s6b" Feb 17 09:17:54 crc kubenswrapper[4848]: I0217 09:17:54.951337 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-b2s6b" Feb 17 09:17:54 crc kubenswrapper[4848]: I0217 09:17:54.965017 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-r5tp9" podUID="549f9474-efed-4da0-9ea1-b910917ba420" containerName="registry-server" containerID="cri-o://600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41" gracePeriod=2 Feb 17 09:17:55 crc kubenswrapper[4848]: I0217 09:17:55.416177 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-b2s6b"] Feb 17 09:17:55 crc kubenswrapper[4848]: W0217 09:17:55.422089 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b87255a_321f_4b26_bc23_a7d5aeff53e2.slice/crio-972a26acdb371a87c0ffb5d542c12417b13c00290c701726364900b7ca50b967 WatchSource:0}: Error finding container 972a26acdb371a87c0ffb5d542c12417b13c00290c701726364900b7ca50b967: Status 404 returned error can't find the container with id 972a26acdb371a87c0ffb5d542c12417b13c00290c701726364900b7ca50b967 Feb 17 09:17:55 crc kubenswrapper[4848]: I0217 09:17:55.865448 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-r5tp9" Feb 17 09:17:55 crc kubenswrapper[4848]: I0217 09:17:55.953713 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chtjf\" (UniqueName: \"kubernetes.io/projected/549f9474-efed-4da0-9ea1-b910917ba420-kube-api-access-chtjf\") pod \"549f9474-efed-4da0-9ea1-b910917ba420\" (UID: \"549f9474-efed-4da0-9ea1-b910917ba420\") " Feb 17 09:17:55 crc kubenswrapper[4848]: I0217 09:17:55.960292 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549f9474-efed-4da0-9ea1-b910917ba420-kube-api-access-chtjf" (OuterVolumeSpecName: "kube-api-access-chtjf") pod "549f9474-efed-4da0-9ea1-b910917ba420" (UID: "549f9474-efed-4da0-9ea1-b910917ba420"). InnerVolumeSpecName "kube-api-access-chtjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:17:55 crc kubenswrapper[4848]: I0217 09:17:55.974332 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b2s6b" event={"ID":"7b87255a-321f-4b26-bc23-a7d5aeff53e2","Type":"ContainerStarted","Data":"972a26acdb371a87c0ffb5d542c12417b13c00290c701726364900b7ca50b967"} Feb 17 09:17:55 crc kubenswrapper[4848]: I0217 09:17:55.976962 4848 generic.go:334] "Generic (PLEG): container finished" podID="549f9474-efed-4da0-9ea1-b910917ba420" containerID="600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41" exitCode=0 Feb 17 09:17:55 crc kubenswrapper[4848]: I0217 09:17:55.977042 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r5tp9" event={"ID":"549f9474-efed-4da0-9ea1-b910917ba420","Type":"ContainerDied","Data":"600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41"} Feb 17 09:17:55 crc kubenswrapper[4848]: I0217 09:17:55.977095 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-r5tp9" 
event={"ID":"549f9474-efed-4da0-9ea1-b910917ba420","Type":"ContainerDied","Data":"033acc5aae0657dc2d1167db75d77927b136f63d03e85690076d1262352b7d26"} Feb 17 09:17:55 crc kubenswrapper[4848]: I0217 09:17:55.977133 4848 scope.go:117] "RemoveContainer" containerID="600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41" Feb 17 09:17:55 crc kubenswrapper[4848]: I0217 09:17:55.977255 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-r5tp9" Feb 17 09:17:56 crc kubenswrapper[4848]: I0217 09:17:56.011071 4848 scope.go:117] "RemoveContainer" containerID="600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41" Feb 17 09:17:56 crc kubenswrapper[4848]: E0217 09:17:56.011944 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41\": container with ID starting with 600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41 not found: ID does not exist" containerID="600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41" Feb 17 09:17:56 crc kubenswrapper[4848]: I0217 09:17:56.012217 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41"} err="failed to get container status \"600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41\": rpc error: code = NotFound desc = could not find container \"600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41\": container with ID starting with 600b71f80499eb75c15fd886201a82c25937a4ca821c33fceefe0e7265179c41 not found: ID does not exist" Feb 17 09:17:56 crc kubenswrapper[4848]: I0217 09:17:56.016418 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-r5tp9"] Feb 17 09:17:56 crc kubenswrapper[4848]: I0217 
09:17:56.024013 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-r5tp9"] Feb 17 09:17:56 crc kubenswrapper[4848]: I0217 09:17:56.056024 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chtjf\" (UniqueName: \"kubernetes.io/projected/549f9474-efed-4da0-9ea1-b910917ba420-kube-api-access-chtjf\") on node \"crc\" DevicePath \"\"" Feb 17 09:17:56 crc kubenswrapper[4848]: I0217 09:17:56.990469 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-b2s6b" event={"ID":"7b87255a-321f-4b26-bc23-a7d5aeff53e2","Type":"ContainerStarted","Data":"01eb9c6eed157e63594117f45775712099aa48900b8cd4a2dcffc050e1778801"} Feb 17 09:17:57 crc kubenswrapper[4848]: I0217 09:17:57.016022 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-b2s6b" podStartSLOduration=2.613108313 podStartE2EDuration="3.01599913s" podCreationTimestamp="2026-02-17 09:17:54 +0000 UTC" firstStartedPulling="2026-02-17 09:17:55.425620942 +0000 UTC m=+752.968876618" lastFinishedPulling="2026-02-17 09:17:55.828511749 +0000 UTC m=+753.371767435" observedRunningTime="2026-02-17 09:17:57.011553252 +0000 UTC m=+754.554808938" watchObservedRunningTime="2026-02-17 09:17:57.01599913 +0000 UTC m=+754.559254776" Feb 17 09:17:57 crc kubenswrapper[4848]: I0217 09:17:57.024167 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-hrx8s" Feb 17 09:17:57 crc kubenswrapper[4848]: I0217 09:17:57.044942 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-78b44bf5bb-5l7zm" Feb 17 09:17:57 crc kubenswrapper[4848]: I0217 09:17:57.391703 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549f9474-efed-4da0-9ea1-b910917ba420" path="/var/lib/kubelet/pods/549f9474-efed-4da0-9ea1-b910917ba420/volumes" Feb 17 
09:17:58 crc kubenswrapper[4848]: I0217 09:17:58.262537 4848 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 17 09:18:04 crc kubenswrapper[4848]: I0217 09:18:04.952321 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-b2s6b" Feb 17 09:18:04 crc kubenswrapper[4848]: I0217 09:18:04.952926 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-b2s6b" Feb 17 09:18:05 crc kubenswrapper[4848]: I0217 09:18:05.001976 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-b2s6b" Feb 17 09:18:05 crc kubenswrapper[4848]: I0217 09:18:05.094654 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-b2s6b" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.040843 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw"] Feb 17 09:18:06 crc kubenswrapper[4848]: E0217 09:18:06.041314 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549f9474-efed-4da0-9ea1-b910917ba420" containerName="registry-server" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.041335 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="549f9474-efed-4da0-9ea1-b910917ba420" containerName="registry-server" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.041554 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="549f9474-efed-4da0-9ea1-b910917ba420" containerName="registry-server" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.042937 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.044889 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hkws7" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.060226 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw"] Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.094730 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-bundle\") pod \"4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.094807 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzcr5\" (UniqueName: \"kubernetes.io/projected/0eda6969-cb8d-4b90-84a0-606b61156a05-kube-api-access-zzcr5\") pod \"4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.094848 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-util\") pod \"4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 
09:18:06.195657 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-bundle\") pod \"4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.195706 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzcr5\" (UniqueName: \"kubernetes.io/projected/0eda6969-cb8d-4b90-84a0-606b61156a05-kube-api-access-zzcr5\") pod \"4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.195750 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-util\") pod \"4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.196201 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-bundle\") pod \"4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.196272 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-util\") pod \"4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.222581 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzcr5\" (UniqueName: \"kubernetes.io/projected/0eda6969-cb8d-4b90-84a0-606b61156a05-kube-api-access-zzcr5\") pod \"4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.365408 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:06 crc kubenswrapper[4848]: I0217 09:18:06.620655 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw"] Feb 17 09:18:06 crc kubenswrapper[4848]: W0217 09:18:06.622451 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0eda6969_cb8d_4b90_84a0_606b61156a05.slice/crio-061180dc83f518ecb2c3ec96f667f11d0eea72008f723de553e94fa84ab85615 WatchSource:0}: Error finding container 061180dc83f518ecb2c3ec96f667f11d0eea72008f723de553e94fa84ab85615: Status 404 returned error can't find the container with id 061180dc83f518ecb2c3ec96f667f11d0eea72008f723de553e94fa84ab85615 Feb 17 09:18:07 crc kubenswrapper[4848]: I0217 09:18:07.096588 4848 generic.go:334] "Generic (PLEG): container finished" podID="0eda6969-cb8d-4b90-84a0-606b61156a05" containerID="952f87dc32ae5cb91d8a2dba40ed15fb6f5ae2473faea52f6783a22deb27876c" exitCode=0 Feb 17 
09:18:07 crc kubenswrapper[4848]: I0217 09:18:07.096708 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" event={"ID":"0eda6969-cb8d-4b90-84a0-606b61156a05","Type":"ContainerDied","Data":"952f87dc32ae5cb91d8a2dba40ed15fb6f5ae2473faea52f6783a22deb27876c"} Feb 17 09:18:07 crc kubenswrapper[4848]: I0217 09:18:07.097426 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" event={"ID":"0eda6969-cb8d-4b90-84a0-606b61156a05","Type":"ContainerStarted","Data":"061180dc83f518ecb2c3ec96f667f11d0eea72008f723de553e94fa84ab85615"} Feb 17 09:18:08 crc kubenswrapper[4848]: I0217 09:18:08.109491 4848 generic.go:334] "Generic (PLEG): container finished" podID="0eda6969-cb8d-4b90-84a0-606b61156a05" containerID="df8a5dd172f8dc0bbe5a017ff46c8ccd58895804d5dcbd4f97a82946d1547248" exitCode=0 Feb 17 09:18:08 crc kubenswrapper[4848]: I0217 09:18:08.109553 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" event={"ID":"0eda6969-cb8d-4b90-84a0-606b61156a05","Type":"ContainerDied","Data":"df8a5dd172f8dc0bbe5a017ff46c8ccd58895804d5dcbd4f97a82946d1547248"} Feb 17 09:18:09 crc kubenswrapper[4848]: I0217 09:18:09.120398 4848 generic.go:334] "Generic (PLEG): container finished" podID="0eda6969-cb8d-4b90-84a0-606b61156a05" containerID="73976b9df6caa5e0e428b8b27460610b48aab049f5c239130595ee44b653bcd1" exitCode=0 Feb 17 09:18:09 crc kubenswrapper[4848]: I0217 09:18:09.120450 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" event={"ID":"0eda6969-cb8d-4b90-84a0-606b61156a05","Type":"ContainerDied","Data":"73976b9df6caa5e0e428b8b27460610b48aab049f5c239130595ee44b653bcd1"} Feb 17 09:18:10 crc kubenswrapper[4848]: I0217 09:18:10.439744 
4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:10 crc kubenswrapper[4848]: I0217 09:18:10.560443 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-util\") pod \"0eda6969-cb8d-4b90-84a0-606b61156a05\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " Feb 17 09:18:10 crc kubenswrapper[4848]: I0217 09:18:10.560549 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzcr5\" (UniqueName: \"kubernetes.io/projected/0eda6969-cb8d-4b90-84a0-606b61156a05-kube-api-access-zzcr5\") pod \"0eda6969-cb8d-4b90-84a0-606b61156a05\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " Feb 17 09:18:10 crc kubenswrapper[4848]: I0217 09:18:10.560664 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-bundle\") pod \"0eda6969-cb8d-4b90-84a0-606b61156a05\" (UID: \"0eda6969-cb8d-4b90-84a0-606b61156a05\") " Feb 17 09:18:10 crc kubenswrapper[4848]: I0217 09:18:10.562144 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-bundle" (OuterVolumeSpecName: "bundle") pod "0eda6969-cb8d-4b90-84a0-606b61156a05" (UID: "0eda6969-cb8d-4b90-84a0-606b61156a05"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:18:10 crc kubenswrapper[4848]: I0217 09:18:10.569839 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0eda6969-cb8d-4b90-84a0-606b61156a05-kube-api-access-zzcr5" (OuterVolumeSpecName: "kube-api-access-zzcr5") pod "0eda6969-cb8d-4b90-84a0-606b61156a05" (UID: "0eda6969-cb8d-4b90-84a0-606b61156a05"). 
InnerVolumeSpecName "kube-api-access-zzcr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:18:10 crc kubenswrapper[4848]: I0217 09:18:10.579625 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-util" (OuterVolumeSpecName: "util") pod "0eda6969-cb8d-4b90-84a0-606b61156a05" (UID: "0eda6969-cb8d-4b90-84a0-606b61156a05"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:18:10 crc kubenswrapper[4848]: I0217 09:18:10.662440 4848 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-util\") on node \"crc\" DevicePath \"\"" Feb 17 09:18:10 crc kubenswrapper[4848]: I0217 09:18:10.662480 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzcr5\" (UniqueName: \"kubernetes.io/projected/0eda6969-cb8d-4b90-84a0-606b61156a05-kube-api-access-zzcr5\") on node \"crc\" DevicePath \"\"" Feb 17 09:18:10 crc kubenswrapper[4848]: I0217 09:18:10.662496 4848 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0eda6969-cb8d-4b90-84a0-606b61156a05-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:18:11 crc kubenswrapper[4848]: I0217 09:18:11.140476 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" event={"ID":"0eda6969-cb8d-4b90-84a0-606b61156a05","Type":"ContainerDied","Data":"061180dc83f518ecb2c3ec96f667f11d0eea72008f723de553e94fa84ab85615"} Feb 17 09:18:11 crc kubenswrapper[4848]: I0217 09:18:11.140528 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="061180dc83f518ecb2c3ec96f667f11d0eea72008f723de553e94fa84ab85615" Feb 17 09:18:11 crc kubenswrapper[4848]: I0217 09:18:11.141068 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw" Feb 17 09:18:12 crc kubenswrapper[4848]: I0217 09:18:12.969973 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz"] Feb 17 09:18:12 crc kubenswrapper[4848]: E0217 09:18:12.971302 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eda6969-cb8d-4b90-84a0-606b61156a05" containerName="util" Feb 17 09:18:12 crc kubenswrapper[4848]: I0217 09:18:12.971439 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eda6969-cb8d-4b90-84a0-606b61156a05" containerName="util" Feb 17 09:18:12 crc kubenswrapper[4848]: E0217 09:18:12.971551 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eda6969-cb8d-4b90-84a0-606b61156a05" containerName="pull" Feb 17 09:18:12 crc kubenswrapper[4848]: I0217 09:18:12.971637 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eda6969-cb8d-4b90-84a0-606b61156a05" containerName="pull" Feb 17 09:18:12 crc kubenswrapper[4848]: E0217 09:18:12.971727 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0eda6969-cb8d-4b90-84a0-606b61156a05" containerName="extract" Feb 17 09:18:12 crc kubenswrapper[4848]: I0217 09:18:12.971832 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0eda6969-cb8d-4b90-84a0-606b61156a05" containerName="extract" Feb 17 09:18:12 crc kubenswrapper[4848]: I0217 09:18:12.972072 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="0eda6969-cb8d-4b90-84a0-606b61156a05" containerName="extract" Feb 17 09:18:12 crc kubenswrapper[4848]: I0217 09:18:12.972650 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz" Feb 17 09:18:12 crc kubenswrapper[4848]: I0217 09:18:12.975800 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-hl8jz" Feb 17 09:18:13 crc kubenswrapper[4848]: I0217 09:18:13.005250 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz"] Feb 17 09:18:13 crc kubenswrapper[4848]: I0217 09:18:13.096417 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fhhg\" (UniqueName: \"kubernetes.io/projected/4fa72ae4-db9b-4093-92bf-7aeb4ab8b7ab-kube-api-access-8fhhg\") pod \"openstack-operator-controller-init-7f8db498b4-ps9gz\" (UID: \"4fa72ae4-db9b-4093-92bf-7aeb4ab8b7ab\") " pod="openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz" Feb 17 09:18:13 crc kubenswrapper[4848]: I0217 09:18:13.198364 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fhhg\" (UniqueName: \"kubernetes.io/projected/4fa72ae4-db9b-4093-92bf-7aeb4ab8b7ab-kube-api-access-8fhhg\") pod \"openstack-operator-controller-init-7f8db498b4-ps9gz\" (UID: \"4fa72ae4-db9b-4093-92bf-7aeb4ab8b7ab\") " pod="openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz" Feb 17 09:18:13 crc kubenswrapper[4848]: I0217 09:18:13.219030 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fhhg\" (UniqueName: \"kubernetes.io/projected/4fa72ae4-db9b-4093-92bf-7aeb4ab8b7ab-kube-api-access-8fhhg\") pod \"openstack-operator-controller-init-7f8db498b4-ps9gz\" (UID: \"4fa72ae4-db9b-4093-92bf-7aeb4ab8b7ab\") " pod="openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz" Feb 17 09:18:13 crc kubenswrapper[4848]: I0217 09:18:13.294579 4848 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz" Feb 17 09:18:13 crc kubenswrapper[4848]: I0217 09:18:13.742169 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz"] Feb 17 09:18:14 crc kubenswrapper[4848]: I0217 09:18:14.158634 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz" event={"ID":"4fa72ae4-db9b-4093-92bf-7aeb4ab8b7ab","Type":"ContainerStarted","Data":"c3c17368d6a3d941542aae1ac12c8da38e032d30cabea579aa0bea7056051953"} Feb 17 09:18:18 crc kubenswrapper[4848]: I0217 09:18:18.772091 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:18:18 crc kubenswrapper[4848]: I0217 09:18:18.772641 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:18:19 crc kubenswrapper[4848]: I0217 09:18:19.193824 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz" event={"ID":"4fa72ae4-db9b-4093-92bf-7aeb4ab8b7ab","Type":"ContainerStarted","Data":"c5f43c6248e4769a5df73b7f073e7b57d5268cbe0b8b10278afa97954f322895"} Feb 17 09:18:19 crc kubenswrapper[4848]: I0217 09:18:19.194298 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz" Feb 17 09:18:19 crc 
kubenswrapper[4848]: I0217 09:18:19.246585 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz" podStartSLOduration=2.7751319309999998 podStartE2EDuration="7.246557954s" podCreationTimestamp="2026-02-17 09:18:12 +0000 UTC" firstStartedPulling="2026-02-17 09:18:13.74406796 +0000 UTC m=+771.287323646" lastFinishedPulling="2026-02-17 09:18:18.215494023 +0000 UTC m=+775.758749669" observedRunningTime="2026-02-17 09:18:19.240333095 +0000 UTC m=+776.783588781" watchObservedRunningTime="2026-02-17 09:18:19.246557954 +0000 UTC m=+776.789813630" Feb 17 09:18:23 crc kubenswrapper[4848]: I0217 09:18:23.300529 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-7f8db498b4-ps9gz" Feb 17 09:18:43 crc kubenswrapper[4848]: I0217 09:18:43.946556 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z"] Feb 17 09:18:43 crc kubenswrapper[4848]: I0217 09:18:43.948140 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z" Feb 17 09:18:43 crc kubenswrapper[4848]: I0217 09:18:43.952095 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-mkbdg" Feb 17 09:18:43 crc kubenswrapper[4848]: I0217 09:18:43.956050 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6"] Feb 17 09:18:43 crc kubenswrapper[4848]: I0217 09:18:43.956780 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6" Feb 17 09:18:43 crc kubenswrapper[4848]: I0217 09:18:43.960466 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-vqf78" Feb 17 09:18:43 crc kubenswrapper[4848]: I0217 09:18:43.961734 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.004897 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-958lw"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.006034 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-958lw" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.014335 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-dmhjq" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.019990 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.021010 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.042674 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-rtts8" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.068175 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.116843 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlz44\" (UniqueName: \"kubernetes.io/projected/cf85b89f-2556-4ee7-a12b-6a4379f962e9-kube-api-access-jlz44\") pod \"barbican-operator-controller-manager-868647ff47-vqb7z\" (UID: \"cf85b89f-2556-4ee7-a12b-6a4379f962e9\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.116904 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7whk\" (UniqueName: \"kubernetes.io/projected/17a4dcbd-4735-48d6-a575-f7d3af6843f1-kube-api-access-b7whk\") pod \"glance-operator-controller-manager-77987464f4-958lw\" (UID: \"17a4dcbd-4735-48d6-a575-f7d3af6843f1\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-958lw" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.116933 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82gx2\" (UniqueName: \"kubernetes.io/projected/b2e407ed-c962-4fcf-b367-f4164d644de6-kube-api-access-82gx2\") pod \"cinder-operator-controller-manager-5d946d989d-wlll6\" (UID: \"b2e407ed-c962-4fcf-b367-f4164d644de6\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 
09:18:44.129788 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.141853 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-958lw"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.148498 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.149266 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.151050 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-lc6w8" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.154352 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.155097 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.160039 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.161580 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-7qm2w" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.172715 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.173322 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.177371 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-45p25" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.197487 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.218271 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlz44\" (UniqueName: \"kubernetes.io/projected/cf85b89f-2556-4ee7-a12b-6a4379f962e9-kube-api-access-jlz44\") pod \"barbican-operator-controller-manager-868647ff47-vqb7z\" (UID: \"cf85b89f-2556-4ee7-a12b-6a4379f962e9\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.218311 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7whk\" (UniqueName: 
\"kubernetes.io/projected/17a4dcbd-4735-48d6-a575-f7d3af6843f1-kube-api-access-b7whk\") pod \"glance-operator-controller-manager-77987464f4-958lw\" (UID: \"17a4dcbd-4735-48d6-a575-f7d3af6843f1\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-958lw" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.218343 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cd8rx\" (UniqueName: \"kubernetes.io/projected/05876a75-9b3e-45b7-a3fe-89ab569742fd-kube-api-access-cd8rx\") pod \"designate-operator-controller-manager-6d8bf5c495-fbwmm\" (UID: \"05876a75-9b3e-45b7-a3fe-89ab569742fd\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.218366 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrn82\" (UniqueName: \"kubernetes.io/projected/32c32c38-9ebf-4e9a-bea8-e761159dda5f-kube-api-access-qrn82\") pod \"heat-operator-controller-manager-69f49c598c-86z5g\" (UID: \"32c32c38-9ebf-4e9a-bea8-e761159dda5f\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.218384 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82gx2\" (UniqueName: \"kubernetes.io/projected/b2e407ed-c962-4fcf-b367-f4164d644de6-kube-api-access-82gx2\") pod \"cinder-operator-controller-manager-5d946d989d-wlll6\" (UID: \"b2e407ed-c962-4fcf-b367-f4164d644de6\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.218402 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert\") pod 
\"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.218433 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppnnh\" (UniqueName: \"kubernetes.io/projected/ce2d3288-2b7d-4db8-861d-0a413fc90222-kube-api-access-ppnnh\") pod \"horizon-operator-controller-manager-5b9b8895d5-5lpnr\" (UID: \"ce2d3288-2b7d-4db8-861d-0a413fc90222\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.218460 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swddc\" (UniqueName: \"kubernetes.io/projected/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-kube-api-access-swddc\") pod \"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.231995 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.244577 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.245513 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.250013 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-mwm6f" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.254655 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlz44\" (UniqueName: \"kubernetes.io/projected/cf85b89f-2556-4ee7-a12b-6a4379f962e9-kube-api-access-jlz44\") pod \"barbican-operator-controller-manager-868647ff47-vqb7z\" (UID: \"cf85b89f-2556-4ee7-a12b-6a4379f962e9\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.260251 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.261056 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7whk\" (UniqueName: \"kubernetes.io/projected/17a4dcbd-4735-48d6-a575-f7d3af6843f1-kube-api-access-b7whk\") pod \"glance-operator-controller-manager-77987464f4-958lw\" (UID: \"17a4dcbd-4735-48d6-a575-f7d3af6843f1\") " pod="openstack-operators/glance-operator-controller-manager-77987464f4-958lw" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.261183 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82gx2\" (UniqueName: \"kubernetes.io/projected/b2e407ed-c962-4fcf-b367-f4164d644de6-kube-api-access-82gx2\") pod \"cinder-operator-controller-manager-5d946d989d-wlll6\" (UID: \"b2e407ed-c962-4fcf-b367-f4164d644de6\") " pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.291201 4848 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.292677 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.295026 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zrhq6" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.296887 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.297901 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.299398 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-tx8z8" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.308571 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.313820 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.316890 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.319014 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppnnh\" (UniqueName: \"kubernetes.io/projected/ce2d3288-2b7d-4db8-861d-0a413fc90222-kube-api-access-ppnnh\") pod 
\"horizon-operator-controller-manager-5b9b8895d5-5lpnr\" (UID: \"ce2d3288-2b7d-4db8-861d-0a413fc90222\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.319062 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swddc\" (UniqueName: \"kubernetes.io/projected/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-kube-api-access-swddc\") pod \"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.319121 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cd8rx\" (UniqueName: \"kubernetes.io/projected/05876a75-9b3e-45b7-a3fe-89ab569742fd-kube-api-access-cd8rx\") pod \"designate-operator-controller-manager-6d8bf5c495-fbwmm\" (UID: \"05876a75-9b3e-45b7-a3fe-89ab569742fd\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.319146 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrn82\" (UniqueName: \"kubernetes.io/projected/32c32c38-9ebf-4e9a-bea8-e761159dda5f-kube-api-access-qrn82\") pod \"heat-operator-controller-manager-69f49c598c-86z5g\" (UID: \"32c32c38-9ebf-4e9a-bea8-e761159dda5f\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.319171 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert\") pod \"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:18:44 
crc kubenswrapper[4848]: E0217 09:18:44.319273 4848 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 09:18:44 crc kubenswrapper[4848]: E0217 09:18:44.319321 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert podName:a2de98b6-28a9-446d-bc9b-ac7aad58be7d nodeName:}" failed. No retries permitted until 2026-02-17 09:18:44.819303288 +0000 UTC m=+802.362558924 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert") pod "infra-operator-controller-manager-79d975b745-dlmq4" (UID: "a2de98b6-28a9-446d-bc9b-ac7aad58be7d") : secret "infra-operator-webhook-server-cert" not found Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.320347 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.321226 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.326602 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.326988 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.337584 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.337687 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.338017 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.340834 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-sgl9m" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.341682 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-zx79z" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.343787 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swddc\" (UniqueName: \"kubernetes.io/projected/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-kube-api-access-swddc\") pod \"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.344230 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrn82\" (UniqueName: \"kubernetes.io/projected/32c32c38-9ebf-4e9a-bea8-e761159dda5f-kube-api-access-qrn82\") pod \"heat-operator-controller-manager-69f49c598c-86z5g\" (UID: \"32c32c38-9ebf-4e9a-bea8-e761159dda5f\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.348938 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.351210 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.352044 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppnnh\" (UniqueName: \"kubernetes.io/projected/ce2d3288-2b7d-4db8-861d-0a413fc90222-kube-api-access-ppnnh\") pod \"horizon-operator-controller-manager-5b9b8895d5-5lpnr\" (UID: \"ce2d3288-2b7d-4db8-861d-0a413fc90222\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.355191 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.357492 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-mgbfp" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.363522 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.364344 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.367993 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cd8rx\" (UniqueName: \"kubernetes.io/projected/05876a75-9b3e-45b7-a3fe-89ab569742fd-kube-api-access-cd8rx\") pod \"designate-operator-controller-manager-6d8bf5c495-fbwmm\" (UID: \"05876a75-9b3e-45b7-a3fe-89ab569742fd\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.368781 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.369143 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-h5zwv" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.384407 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.388313 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.392149 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-k5d5v" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.393402 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.394596 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.396131 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-29gdb" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.396269 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.400872 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.414381 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.428692 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kss8j\" (UniqueName: \"kubernetes.io/projected/aa45cd11-5d86-47c3-b46e-15c0b204feb6-kube-api-access-kss8j\") pod \"neutron-operator-controller-manager-64ddbf8bb-fkvjd\" (UID: \"aa45cd11-5d86-47c3-b46e-15c0b204feb6\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.428741 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6gb6\" (UniqueName: \"kubernetes.io/projected/e6251943-952f-4cbc-924c-b362d9f7c8da-kube-api-access-z6gb6\") pod \"manila-operator-controller-manager-54f6768c69-qd7ds\" (UID: \"e6251943-952f-4cbc-924c-b362d9f7c8da\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.428782 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc5gx\" (UniqueName: \"kubernetes.io/projected/88153939-7ca7-448d-a21c-b8330360b5a1-kube-api-access-pc5gx\") pod \"nova-operator-controller-manager-567668f5cf-rvzql\" (UID: \"88153939-7ca7-448d-a21c-b8330360b5a1\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.428829 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5tv7\" (UniqueName: \"kubernetes.io/projected/f151e0ea-ac05-426d-aa94-e32cc25fdc09-kube-api-access-l5tv7\") pod \"mariadb-operator-controller-manager-6994f66f48-lth6q\" (UID: \"f151e0ea-ac05-426d-aa94-e32cc25fdc09\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.428886 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnrqm\" (UniqueName: \"kubernetes.io/projected/f658d1a9-916e-41c9-8268-e94c22c6a045-kube-api-access-jnrqm\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.428909 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65wsg\" (UniqueName: \"kubernetes.io/projected/aebf2b53-aeb7-44a1-a2c2-3dfaa0e01bdd-kube-api-access-65wsg\") pod \"octavia-operator-controller-manager-69f8888797-mcnbc\" (UID: \"aebf2b53-aeb7-44a1-a2c2-3dfaa0e01bdd\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.428992 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-zzqpc\" (UniqueName: \"kubernetes.io/projected/3922bb1d-9f36-4ffc-b382-a54c1c213008-kube-api-access-zzqpc\") pod \"placement-operator-controller-manager-8497b45c89-ckxfq\" (UID: \"3922bb1d-9f36-4ffc-b382-a54c1c213008\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.429018 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmv22\" (UniqueName: \"kubernetes.io/projected/97430748-300a-434e-a6b3-52274422ab66-kube-api-access-cmv22\") pod \"keystone-operator-controller-manager-b4d948c87-9wvc8\" (UID: \"97430748-300a-434e-a6b3-52274422ab66\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.429051 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkrxr\" (UniqueName: \"kubernetes.io/projected/04f9fe37-de58-4b62-896e-0945a7bcbfdf-kube-api-access-wkrxr\") pod \"ironic-operator-controller-manager-554564d7fc-sw5bf\" (UID: \"04f9fe37-de58-4b62-896e-0945a7bcbfdf\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.429086 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.433965 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 
09:18:44.435660 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.438803 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-kw7wg" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.458628 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.470934 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.472681 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.487483 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987464f4-958lw" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.490286 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-mltd5"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.497962 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mltd5" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.502356 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-d7xrm" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.511558 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.530614 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-mltd5"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.536543 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzqpc\" (UniqueName: \"kubernetes.io/projected/3922bb1d-9f36-4ffc-b382-a54c1c213008-kube-api-access-zzqpc\") pod \"placement-operator-controller-manager-8497b45c89-ckxfq\" (UID: \"3922bb1d-9f36-4ffc-b382-a54c1c213008\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.536633 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmv22\" (UniqueName: \"kubernetes.io/projected/97430748-300a-434e-a6b3-52274422ab66-kube-api-access-cmv22\") pod \"keystone-operator-controller-manager-b4d948c87-9wvc8\" (UID: \"97430748-300a-434e-a6b3-52274422ab66\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.536707 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkrxr\" (UniqueName: \"kubernetes.io/projected/04f9fe37-de58-4b62-896e-0945a7bcbfdf-kube-api-access-wkrxr\") pod \"ironic-operator-controller-manager-554564d7fc-sw5bf\" (UID: \"04f9fe37-de58-4b62-896e-0945a7bcbfdf\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.536817 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrbw8\" (UniqueName: \"kubernetes.io/projected/653da755-b43b-4da9-bfd9-e8ee0bb44cc4-kube-api-access-zrbw8\") pod 
\"ovn-operator-controller-manager-d44cf6b75-ckggs\" (UID: \"653da755-b43b-4da9-bfd9-e8ee0bb44cc4\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.536882 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.536944 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kss8j\" (UniqueName: \"kubernetes.io/projected/aa45cd11-5d86-47c3-b46e-15c0b204feb6-kube-api-access-kss8j\") pod \"neutron-operator-controller-manager-64ddbf8bb-fkvjd\" (UID: \"aa45cd11-5d86-47c3-b46e-15c0b204feb6\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.536969 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6gb6\" (UniqueName: \"kubernetes.io/projected/e6251943-952f-4cbc-924c-b362d9f7c8da-kube-api-access-z6gb6\") pod \"manila-operator-controller-manager-54f6768c69-qd7ds\" (UID: \"e6251943-952f-4cbc-924c-b362d9f7c8da\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.537015 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc5gx\" (UniqueName: \"kubernetes.io/projected/88153939-7ca7-448d-a21c-b8330360b5a1-kube-api-access-pc5gx\") pod \"nova-operator-controller-manager-567668f5cf-rvzql\" (UID: \"88153939-7ca7-448d-a21c-b8330360b5a1\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql" Feb 
17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.537045 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5tv7\" (UniqueName: \"kubernetes.io/projected/f151e0ea-ac05-426d-aa94-e32cc25fdc09-kube-api-access-l5tv7\") pod \"mariadb-operator-controller-manager-6994f66f48-lth6q\" (UID: \"f151e0ea-ac05-426d-aa94-e32cc25fdc09\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.537115 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnrqm\" (UniqueName: \"kubernetes.io/projected/f658d1a9-916e-41c9-8268-e94c22c6a045-kube-api-access-jnrqm\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.537135 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65wsg\" (UniqueName: \"kubernetes.io/projected/aebf2b53-aeb7-44a1-a2c2-3dfaa0e01bdd-kube-api-access-65wsg\") pod \"octavia-operator-controller-manager-69f8888797-mcnbc\" (UID: \"aebf2b53-aeb7-44a1-a2c2-3dfaa0e01bdd\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc" Feb 17 09:18:44 crc kubenswrapper[4848]: E0217 09:18:44.537139 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.537188 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q789p\" (UniqueName: \"kubernetes.io/projected/6b8d9b10-d577-4621-88d4-6f26e692a502-kube-api-access-q789p\") pod \"swift-operator-controller-manager-68f46476f-mltd5\" (UID: 
\"6b8d9b10-d577-4621-88d4-6f26e692a502\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-mltd5" Feb 17 09:18:44 crc kubenswrapper[4848]: E0217 09:18:44.537207 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert podName:f658d1a9-916e-41c9-8268-e94c22c6a045 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:45.037186118 +0000 UTC m=+802.580441764 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert") pod "openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" (UID: "f658d1a9-916e-41c9-8268-e94c22c6a045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.547810 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.563270 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kss8j\" (UniqueName: \"kubernetes.io/projected/aa45cd11-5d86-47c3-b46e-15c0b204feb6-kube-api-access-kss8j\") pod \"neutron-operator-controller-manager-64ddbf8bb-fkvjd\" (UID: \"aa45cd11-5d86-47c3-b46e-15c0b204feb6\") " pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.566580 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65wsg\" (UniqueName: \"kubernetes.io/projected/aebf2b53-aeb7-44a1-a2c2-3dfaa0e01bdd-kube-api-access-65wsg\") pod \"octavia-operator-controller-manager-69f8888797-mcnbc\" (UID: \"aebf2b53-aeb7-44a1-a2c2-3dfaa0e01bdd\") " pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.567102 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6gb6\" (UniqueName: \"kubernetes.io/projected/e6251943-952f-4cbc-924c-b362d9f7c8da-kube-api-access-z6gb6\") pod \"manila-operator-controller-manager-54f6768c69-qd7ds\" (UID: \"e6251943-952f-4cbc-924c-b362d9f7c8da\") " pod="openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.569045 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzqpc\" (UniqueName: \"kubernetes.io/projected/3922bb1d-9f36-4ffc-b382-a54c1c213008-kube-api-access-zzqpc\") pod \"placement-operator-controller-manager-8497b45c89-ckxfq\" (UID: \"3922bb1d-9f36-4ffc-b382-a54c1c213008\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.571299 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5tv7\" (UniqueName: \"kubernetes.io/projected/f151e0ea-ac05-426d-aa94-e32cc25fdc09-kube-api-access-l5tv7\") pod \"mariadb-operator-controller-manager-6994f66f48-lth6q\" (UID: \"f151e0ea-ac05-426d-aa94-e32cc25fdc09\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.580586 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmv22\" (UniqueName: \"kubernetes.io/projected/97430748-300a-434e-a6b3-52274422ab66-kube-api-access-cmv22\") pod \"keystone-operator-controller-manager-b4d948c87-9wvc8\" (UID: \"97430748-300a-434e-a6b3-52274422ab66\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.582905 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnrqm\" (UniqueName: 
\"kubernetes.io/projected/f658d1a9-916e-41c9-8268-e94c22c6a045-kube-api-access-jnrqm\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.583607 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc5gx\" (UniqueName: \"kubernetes.io/projected/88153939-7ca7-448d-a21c-b8330360b5a1-kube-api-access-pc5gx\") pod \"nova-operator-controller-manager-567668f5cf-rvzql\" (UID: \"88153939-7ca7-448d-a21c-b8330360b5a1\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.595326 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.597865 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.605477 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-5ft58" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.605494 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkrxr\" (UniqueName: \"kubernetes.io/projected/04f9fe37-de58-4b62-896e-0945a7bcbfdf-kube-api-access-wkrxr\") pod \"ironic-operator-controller-manager-554564d7fc-sw5bf\" (UID: \"04f9fe37-de58-4b62-896e-0945a7bcbfdf\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.607721 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.608687 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.618851 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.636654 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.651300 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q789p\" (UniqueName: \"kubernetes.io/projected/6b8d9b10-d577-4621-88d4-6f26e692a502-kube-api-access-q789p\") pod \"swift-operator-controller-manager-68f46476f-mltd5\" (UID: \"6b8d9b10-d577-4621-88d4-6f26e692a502\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-mltd5" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.651395 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrbw8\" (UniqueName: \"kubernetes.io/projected/653da755-b43b-4da9-bfd9-e8ee0bb44cc4-kube-api-access-zrbw8\") pod \"ovn-operator-controller-manager-d44cf6b75-ckggs\" (UID: \"653da755-b43b-4da9-bfd9-e8ee0bb44cc4\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.691570 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrbw8\" (UniqueName: \"kubernetes.io/projected/653da755-b43b-4da9-bfd9-e8ee0bb44cc4-kube-api-access-zrbw8\") pod \"ovn-operator-controller-manager-d44cf6b75-ckggs\" (UID: 
\"653da755-b43b-4da9-bfd9-e8ee0bb44cc4\") " pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.696954 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-4phd8"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.698094 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.708932 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-7w8fh" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.713978 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-4phd8"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.722782 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q789p\" (UniqueName: \"kubernetes.io/projected/6b8d9b10-d577-4621-88d4-6f26e692a502-kube-api-access-q789p\") pod \"swift-operator-controller-manager-68f46476f-mltd5\" (UID: \"6b8d9b10-d577-4621-88d4-6f26e692a502\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-mltd5" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.735169 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.752999 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw62j\" (UniqueName: \"kubernetes.io/projected/bced4dcb-9bee-42a5-9e52-e6ddc83f8f06-kube-api-access-qw62j\") pod \"telemetry-operator-controller-manager-7f45b4ff68-jcd2x\" (UID: \"bced4dcb-9bee-42a5-9e52-e6ddc83f8f06\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.783792 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.785295 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.785412 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.794636 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-frsbz" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.794836 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.795385 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.806249 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.806810 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.813705 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.814646 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.815713 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.821706 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5z82p" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.822954 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.823087 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.824480 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.842407 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.843325 4848 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.846264 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-djzk7" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.847377 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h"] Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.854042 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qw62j\" (UniqueName: \"kubernetes.io/projected/bced4dcb-9bee-42a5-9e52-e6ddc83f8f06-kube-api-access-qw62j\") pod \"telemetry-operator-controller-manager-7f45b4ff68-jcd2x\" (UID: \"bced4dcb-9bee-42a5-9e52-e6ddc83f8f06\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.854100 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57pps\" (UniqueName: \"kubernetes.io/projected/632a67ba-e5ae-43bb-a69e-49cf64c054e4-kube-api-access-57pps\") pod \"test-operator-controller-manager-7866795846-4phd8\" (UID: \"632a67ba-e5ae-43bb-a69e-49cf64c054e4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.854144 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert\") pod \"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:18:44 crc kubenswrapper[4848]: E0217 09:18:44.855261 4848 secret.go:188] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 09:18:44 crc kubenswrapper[4848]: E0217 09:18:44.855313 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert podName:a2de98b6-28a9-446d-bc9b-ac7aad58be7d nodeName:}" failed. No retries permitted until 2026-02-17 09:18:45.855296443 +0000 UTC m=+803.398552089 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert") pod "infra-operator-controller-manager-79d975b745-dlmq4" (UID: "a2de98b6-28a9-446d-bc9b-ac7aad58be7d") : secret "infra-operator-webhook-server-cert" not found Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.873609 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.873707 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qw62j\" (UniqueName: \"kubernetes.io/projected/bced4dcb-9bee-42a5-9e52-e6ddc83f8f06-kube-api-access-qw62j\") pod \"telemetry-operator-controller-manager-7f45b4ff68-jcd2x\" (UID: \"bced4dcb-9bee-42a5-9e52-e6ddc83f8f06\") " pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.893112 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mltd5" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.921554 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.958287 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.958317 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.958388 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88xnb\" (UniqueName: \"kubernetes.io/projected/32b36aa1-7151-443d-9091-bc1e8ea86805-kube-api-access-88xnb\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.968381 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57pps\" (UniqueName: \"kubernetes.io/projected/632a67ba-e5ae-43bb-a69e-49cf64c054e4-kube-api-access-57pps\") pod \"test-operator-controller-manager-7866795846-4phd8\" (UID: \"632a67ba-e5ae-43bb-a69e-49cf64c054e4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" Feb 17 09:18:44 crc 
kubenswrapper[4848]: I0217 09:18:44.968640 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tszw2\" (UniqueName: \"kubernetes.io/projected/a150a634-4cfd-4d77-ada7-5ab1f65a8985-kube-api-access-tszw2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hhb2h\" (UID: \"a150a634-4cfd-4d77-ada7-5ab1f65a8985\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.968732 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bft9\" (UniqueName: \"kubernetes.io/projected/e72f9717-510f-4f9e-8557-ccd69b4dc61c-kube-api-access-5bft9\") pod \"watcher-operator-controller-manager-5db88f68c-7swkp\" (UID: \"e72f9717-510f-4f9e-8557-ccd69b4dc61c\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" Feb 17 09:18:44 crc kubenswrapper[4848]: I0217 09:18:44.990807 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57pps\" (UniqueName: \"kubernetes.io/projected/632a67ba-e5ae-43bb-a69e-49cf64c054e4-kube-api-access-57pps\") pod \"test-operator-controller-manager-7866795846-4phd8\" (UID: \"632a67ba-e5ae-43bb-a69e-49cf64c054e4\") " pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.023682 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.070046 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.070259 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.070350 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert podName:f658d1a9-916e-41c9-8268-e94c22c6a045 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:46.070327619 +0000 UTC m=+803.613583265 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert") pod "openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" (UID: "f658d1a9-916e-41c9-8268-e94c22c6a045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.070157 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88xnb\" (UniqueName: \"kubernetes.io/projected/32b36aa1-7151-443d-9091-bc1e8ea86805-kube-api-access-88xnb\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.070594 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tszw2\" (UniqueName: \"kubernetes.io/projected/a150a634-4cfd-4d77-ada7-5ab1f65a8985-kube-api-access-tszw2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hhb2h\" (UID: \"a150a634-4cfd-4d77-ada7-5ab1f65a8985\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.070689 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bft9\" (UniqueName: \"kubernetes.io/projected/e72f9717-510f-4f9e-8557-ccd69b4dc61c-kube-api-access-5bft9\") pod \"watcher-operator-controller-manager-5db88f68c-7swkp\" (UID: \"e72f9717-510f-4f9e-8557-ccd69b4dc61c\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.070816 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs\") pod 
\"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.070850 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.070933 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.070995 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:45.570975698 +0000 UTC m=+803.114231414 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "webhook-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.071019 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.071082 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. 
No retries permitted until 2026-02-17 09:18:45.571066921 +0000 UTC m=+803.114322567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "metrics-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.089087 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tszw2\" (UniqueName: \"kubernetes.io/projected/a150a634-4cfd-4d77-ada7-5ab1f65a8985-kube-api-access-tszw2\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hhb2h\" (UID: \"a150a634-4cfd-4d77-ada7-5ab1f65a8985\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.089876 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88xnb\" (UniqueName: \"kubernetes.io/projected/32b36aa1-7151-443d-9091-bc1e8ea86805-kube-api-access-88xnb\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.092043 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bft9\" (UniqueName: \"kubernetes.io/projected/e72f9717-510f-4f9e-8557-ccd69b4dc61c-kube-api-access-5bft9\") pod \"watcher-operator-controller-manager-5db88f68c-7swkp\" (UID: \"e72f9717-510f-4f9e-8557-ccd69b4dc61c\") " pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.211293 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.234073 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.322894 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g"] Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.334282 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987464f4-958lw"] Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.334334 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6"] Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.365413 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z" event={"ID":"cf85b89f-2556-4ee7-a12b-6a4379f962e9","Type":"ContainerStarted","Data":"04efa528cbca1c4f69ad1b2d590985035bf04badea22cd3fffd0a6fc2eefd716"} Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.366532 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-958lw" event={"ID":"17a4dcbd-4735-48d6-a575-f7d3af6843f1","Type":"ContainerStarted","Data":"37861c0e92724cbab4fb53e72e6fdff64e60e58395c76dcb37ef57e48f92ca69"} Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.457919 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr"] Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.481741 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm"] Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.580689 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.580884 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.581009 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.581083 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:46.581061121 +0000 UTC m=+804.124316767 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "webhook-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.581187 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.581298 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:46.581273297 +0000 UTC m=+804.124528943 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "metrics-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.886529 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert\") pod \"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.886714 4848 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: E0217 09:18:45.886804 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert 
podName:a2de98b6-28a9-446d-bc9b-ac7aad58be7d nodeName:}" failed. No retries permitted until 2026-02-17 09:18:47.886785102 +0000 UTC m=+805.430040748 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert") pod "infra-operator-controller-manager-79d975b745-dlmq4" (UID: "a2de98b6-28a9-446d-bc9b-ac7aad58be7d") : secret "infra-operator-webhook-server-cert" not found Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.929535 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8"] Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.956169 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc"] Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.974727 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql"] Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.981966 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs"] Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.994164 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq"] Feb 17 09:18:45 crc kubenswrapper[4848]: I0217 09:18:45.999178 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-mltd5"] Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.005415 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q"] Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.015725 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds"] Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.022272 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf"] Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.030643 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd"] Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.038869 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp"] Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.041279 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zrbw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-d44cf6b75-ckggs_openstack-operators(653da755-b43b-4da9-bfd9-e8ee0bb44cc4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.042583 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" podUID="653da755-b43b-4da9-bfd9-e8ee0bb44cc4" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.045911 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zzqpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-8497b45c89-ckxfq_openstack-operators(3922bb1d-9f36-4ffc-b382-a54c1c213008): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.047127 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" podUID="3922bb1d-9f36-4ffc-b382-a54c1c213008" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.048030 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5bft9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-5db88f68c-7swkp_openstack-operators(e72f9717-510f-4f9e-8557-ccd69b4dc61c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.049599 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" podUID="e72f9717-510f-4f9e-8557-ccd69b4dc61c" Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.052400 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h"] Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.058164 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7866795846-4phd8"] Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.060949 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qw62j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-7f45b4ff68-jcd2x_openstack-operators(bced4dcb-9bee-42a5-9e52-e6ddc83f8f06): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.062322 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" podUID="bced4dcb-9bee-42a5-9e52-e6ddc83f8f06" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.066981 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tszw2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-hhb2h_openstack-operators(a150a634-4cfd-4d77-ada7-5ab1f65a8985): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.067706 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x"] Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.067931 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-57pps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7866795846-4phd8_openstack-operators(632a67ba-e5ae-43bb-a69e-49cf64c054e4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.068070 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" podUID="a150a634-4cfd-4d77-ada7-5ab1f65a8985" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.069750 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" podUID="632a67ba-e5ae-43bb-a69e-49cf64c054e4" Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.088948 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.089176 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.089243 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert podName:f658d1a9-916e-41c9-8268-e94c22c6a045 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:48.089224748 +0000 UTC m=+805.632480394 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert") pod "openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" (UID: "f658d1a9-916e-41c9-8268-e94c22c6a045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.374074 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc" event={"ID":"aebf2b53-aeb7-44a1-a2c2-3dfaa0e01bdd","Type":"ContainerStarted","Data":"61a874c606f1f1993c9f3fdcda5b4a212f3fea516068a9522c89065a312a6113"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.375538 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" event={"ID":"653da755-b43b-4da9-bfd9-e8ee0bb44cc4","Type":"ContainerStarted","Data":"5ad6c3ece37ebcf94ea40eaf4ee5cafe795dff6dcc93df6cecfb5d64d3ed69c5"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.378714 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g" 
event={"ID":"32c32c38-9ebf-4e9a-bea8-e761159dda5f","Type":"ContainerStarted","Data":"dd2a1bbfc13fc427c64b377a778de225533c66fd2f8b63dbc6ed6a59e8bbdeef"} Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.379470 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" podUID="653da755-b43b-4da9-bfd9-e8ee0bb44cc4" Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.395790 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mltd5" event={"ID":"6b8d9b10-d577-4621-88d4-6f26e692a502","Type":"ContainerStarted","Data":"b0c3c9a6c1bdba6925ae6eafd2e0d950c14154d012de18cc3172f28129425217"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.400930 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" event={"ID":"e72f9717-510f-4f9e-8557-ccd69b4dc61c","Type":"ContainerStarted","Data":"529db4bb00c97fe76ad9ba19f7e9e5101a5376badad5084a94e87dd76f5c776a"} Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.404233 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" podUID="e72f9717-510f-4f9e-8557-ccd69b4dc61c" Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.416071 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" 
event={"ID":"3922bb1d-9f36-4ffc-b382-a54c1c213008","Type":"ContainerStarted","Data":"fc1fb1e887477fc4f5a9cb69e360a928520c137739eef43ebce53f7d9a153d03"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.417943 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" event={"ID":"bced4dcb-9bee-42a5-9e52-e6ddc83f8f06","Type":"ContainerStarted","Data":"de0ebba37435a65eb9a940dafb13cde50e400745e8c663d8f0f2b5c6bbd3b7e6"} Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.418835 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" podUID="bced4dcb-9bee-42a5-9e52-e6ddc83f8f06" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.420894 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" podUID="3922bb1d-9f36-4ffc-b382-a54c1c213008" Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.429185 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql" event={"ID":"88153939-7ca7-448d-a21c-b8330360b5a1","Type":"ContainerStarted","Data":"d87ff33be13d134b00c5f0028555c1193d3e75598c21c19431f1db2042a66792"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.432280 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q" event={"ID":"f151e0ea-ac05-426d-aa94-e32cc25fdc09","Type":"ContainerStarted","Data":"b798ee9909bf46e8e955f7f48b7922d3ef475cde6d30ced637cdd5c0ecd88cfd"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.433619 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm" event={"ID":"05876a75-9b3e-45b7-a3fe-89ab569742fd","Type":"ContainerStarted","Data":"100ef72cb554976a4acb27ffe099e5a651985b072be45c12e80153ebf2f808c1"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.435592 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf" event={"ID":"04f9fe37-de58-4b62-896e-0945a7bcbfdf","Type":"ContainerStarted","Data":"bb457830570cc6d8642f8fb38bbea110aba4d9388bd73e90a7037f0f1ae84abc"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.436521 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd" event={"ID":"aa45cd11-5d86-47c3-b46e-15c0b204feb6","Type":"ContainerStarted","Data":"2a87e79d95e75773daba679f4c0bcc68a0179d1f98f2e7f22879ee5e8a6d7ea0"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.442801 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr" event={"ID":"ce2d3288-2b7d-4db8-861d-0a413fc90222","Type":"ContainerStarted","Data":"ab3975cc62feafd8e1999706982e6a3ec1f02ea7c8b32be754df970ba60551b0"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.447193 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" event={"ID":"632a67ba-e5ae-43bb-a69e-49cf64c054e4","Type":"ContainerStarted","Data":"a17572a7762b376e2d54b277bf7965fc03ba8babf243a5efd53d902d67b77cf4"} Feb 17 09:18:46 crc 
kubenswrapper[4848]: E0217 09:18:46.448285 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" podUID="632a67ba-e5ae-43bb-a69e-49cf64c054e4" Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.450170 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8" event={"ID":"97430748-300a-434e-a6b3-52274422ab66","Type":"ContainerStarted","Data":"639f09b7e71ebe5a23c73c36d92938a7ee5551a213f95374d7782da57da2908c"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.452347 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" event={"ID":"a150a634-4cfd-4d77-ada7-5ab1f65a8985","Type":"ContainerStarted","Data":"5503a6afebbb1ce6fa7ac0d6e8907d92661f087c5a66cd90b66d354ed84f78a8"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.453528 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds" event={"ID":"e6251943-952f-4cbc-924c-b362d9f7c8da","Type":"ContainerStarted","Data":"662729d92f9edeeccf7715c71a24d55bde18f0fc5149b021f4474e273b981bfa"} Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.454120 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" podUID="a150a634-4cfd-4d77-ada7-5ab1f65a8985" Feb 17 09:18:46 crc 
kubenswrapper[4848]: I0217 09:18:46.459515 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6" event={"ID":"b2e407ed-c962-4fcf-b367-f4164d644de6","Type":"ContainerStarted","Data":"6b9e7793b1ca87a08ba5066cabcee06ff9aeab39e7880a76e5199654ea4bd5a3"} Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.600258 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:46 crc kubenswrapper[4848]: I0217 09:18:46.600321 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.600553 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.600609 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:48.600591689 +0000 UTC m=+806.143847335 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "metrics-server-cert" not found Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.601041 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 09:18:46 crc kubenswrapper[4848]: E0217 09:18:46.601079 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:48.601068863 +0000 UTC m=+806.144324509 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "webhook-server-cert" not found Feb 17 09:18:47 crc kubenswrapper[4848]: E0217 09:18:47.485266 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:543c103838f3e6ef48755665a7695dfa3ed84753c557560257d265db31f92759\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" podUID="653da755-b43b-4da9-bfd9-e8ee0bb44cc4" Feb 17 09:18:47 crc kubenswrapper[4848]: E0217 09:18:47.485266 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:f0fabdf79095def0f8b1c0442925548a94ca94bed4de2d3b171277129f8079e6\\\"\"" 
pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" podUID="632a67ba-e5ae-43bb-a69e-49cf64c054e4" Feb 17 09:18:47 crc kubenswrapper[4848]: E0217 09:18:47.485288 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a57336b9f95b703f80453db87e43a2834ca1bdc89480796d28ebbe0a9702ecfd\\\"\"" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" podUID="3922bb1d-9f36-4ffc-b382-a54c1c213008" Feb 17 09:18:47 crc kubenswrapper[4848]: E0217 09:18:47.485332 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:66a4b9322ebb573313178ea88e31026d4532f461592b9fae2dff71efd9256d99\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" podUID="bced4dcb-9bee-42a5-9e52-e6ddc83f8f06" Feb 17 09:18:47 crc kubenswrapper[4848]: E0217 09:18:47.485345 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" podUID="a150a634-4cfd-4d77-ada7-5ab1f65a8985" Feb 17 09:18:47 crc kubenswrapper[4848]: E0217 09:18:47.485388 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d01ae848290e880c09127d5297418dea40fc7f090fdab9bf2c578c7e7f53aec0\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" 
podUID="e72f9717-510f-4f9e-8557-ccd69b4dc61c" Feb 17 09:18:47 crc kubenswrapper[4848]: I0217 09:18:47.923297 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert\") pod \"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:18:47 crc kubenswrapper[4848]: E0217 09:18:47.923440 4848 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 09:18:47 crc kubenswrapper[4848]: E0217 09:18:47.923495 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert podName:a2de98b6-28a9-446d-bc9b-ac7aad58be7d nodeName:}" failed. No retries permitted until 2026-02-17 09:18:51.923480167 +0000 UTC m=+809.466735813 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert") pod "infra-operator-controller-manager-79d975b745-dlmq4" (UID: "a2de98b6-28a9-446d-bc9b-ac7aad58be7d") : secret "infra-operator-webhook-server-cert" not found Feb 17 09:18:48 crc kubenswrapper[4848]: I0217 09:18:48.126779 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:18:48 crc kubenswrapper[4848]: E0217 09:18:48.126956 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:18:48 crc kubenswrapper[4848]: E0217 09:18:48.127031 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert podName:f658d1a9-916e-41c9-8268-e94c22c6a045 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:52.127013475 +0000 UTC m=+809.670269121 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert") pod "openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" (UID: "f658d1a9-916e-41c9-8268-e94c22c6a045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:18:48 crc kubenswrapper[4848]: I0217 09:18:48.635728 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:48 crc kubenswrapper[4848]: I0217 09:18:48.635816 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:48 crc kubenswrapper[4848]: E0217 09:18:48.635936 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 09:18:48 crc kubenswrapper[4848]: E0217 09:18:48.636015 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 09:18:48 crc kubenswrapper[4848]: E0217 09:18:48.636064 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:52.636037388 +0000 UTC m=+810.179293104 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "webhook-server-cert" not found Feb 17 09:18:48 crc kubenswrapper[4848]: E0217 09:18:48.636098 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. No retries permitted until 2026-02-17 09:18:52.636075479 +0000 UTC m=+810.179331135 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "metrics-server-cert" not found Feb 17 09:18:48 crc kubenswrapper[4848]: I0217 09:18:48.772068 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:18:48 crc kubenswrapper[4848]: I0217 09:18:48.772218 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:18:51 crc kubenswrapper[4848]: I0217 09:18:51.991743 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert\") pod 
\"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:18:51 crc kubenswrapper[4848]: E0217 09:18:51.991941 4848 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 17 09:18:51 crc kubenswrapper[4848]: E0217 09:18:51.992034 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert podName:a2de98b6-28a9-446d-bc9b-ac7aad58be7d nodeName:}" failed. No retries permitted until 2026-02-17 09:18:59.992007695 +0000 UTC m=+817.535263371 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert") pod "infra-operator-controller-manager-79d975b745-dlmq4" (UID: "a2de98b6-28a9-446d-bc9b-ac7aad58be7d") : secret "infra-operator-webhook-server-cert" not found Feb 17 09:18:52 crc kubenswrapper[4848]: I0217 09:18:52.196455 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:18:52 crc kubenswrapper[4848]: E0217 09:18:52.196728 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:18:52 crc kubenswrapper[4848]: E0217 09:18:52.197251 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert podName:f658d1a9-916e-41c9-8268-e94c22c6a045 nodeName:}" failed. 
No retries permitted until 2026-02-17 09:19:00.197226003 +0000 UTC m=+817.740481649 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert") pod "openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" (UID: "f658d1a9-916e-41c9-8268-e94c22c6a045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:18:52 crc kubenswrapper[4848]: I0217 09:18:52.704117 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:52 crc kubenswrapper[4848]: I0217 09:18:52.704178 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:18:52 crc kubenswrapper[4848]: E0217 09:18:52.704418 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 09:18:52 crc kubenswrapper[4848]: E0217 09:18:52.704474 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 09:18:52 crc kubenswrapper[4848]: E0217 09:18:52.704538 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. 
No retries permitted until 2026-02-17 09:19:00.704508814 +0000 UTC m=+818.247764490 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "webhook-server-cert" not found Feb 17 09:18:52 crc kubenswrapper[4848]: E0217 09:18:52.704575 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. No retries permitted until 2026-02-17 09:19:00.704562316 +0000 UTC m=+818.247818002 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "metrics-server-cert" not found Feb 17 09:18:58 crc kubenswrapper[4848]: E0217 09:18:58.467885 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979" Feb 17 09:18:58 crc kubenswrapper[4848]: E0217 09:18:58.469081 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-82gx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-5d946d989d-wlll6_openstack-operators(b2e407ed-c962-4fcf-b367-f4164d644de6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 09:18:58 crc kubenswrapper[4848]: E0217 09:18:58.470341 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6" podUID="b2e407ed-c962-4fcf-b367-f4164d644de6" Feb 17 09:18:58 crc kubenswrapper[4848]: E0217 09:18:58.564950 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:2b8ab3063af4aaeed0198197aae6f391c6647ac686c94c85668537f1d5933979\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6" podUID="b2e407ed-c962-4fcf-b367-f4164d644de6" Feb 17 09:18:59 crc kubenswrapper[4848]: E0217 09:18:59.070885 4848 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 17 09:18:59 crc kubenswrapper[4848]: E0217 09:18:59.071354 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cmv22,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-9wvc8_openstack-operators(97430748-300a-434e-a6b3-52274422ab66): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 09:18:59 crc kubenswrapper[4848]: E0217 09:18:59.072849 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8" podUID="97430748-300a-434e-a6b3-52274422ab66" Feb 17 09:18:59 crc kubenswrapper[4848]: E0217 09:18:59.570424 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8" podUID="97430748-300a-434e-a6b3-52274422ab66" Feb 17 09:18:59 crc kubenswrapper[4848]: E0217 09:18:59.624197 4848 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 17 09:18:59 crc kubenswrapper[4848]: E0217 09:18:59.624367 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pc5gx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-rvzql_openstack-operators(88153939-7ca7-448d-a21c-b8330360b5a1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 09:18:59 crc kubenswrapper[4848]: E0217 09:18:59.625691 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql" podUID="88153939-7ca7-448d-a21c-b8330360b5a1" Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.015207 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert\") pod \"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.023033 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/a2de98b6-28a9-446d-bc9b-ac7aad58be7d-cert\") pod \"infra-operator-controller-manager-79d975b745-dlmq4\" (UID: \"a2de98b6-28a9-446d-bc9b-ac7aad58be7d\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.117640 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.217699 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" Feb 17 09:19:00 crc kubenswrapper[4848]: E0217 09:19:00.217884 4848 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:19:00 crc kubenswrapper[4848]: E0217 09:19:00.218034 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert podName:f658d1a9-916e-41c9-8268-e94c22c6a045 nodeName:}" failed. No retries permitted until 2026-02-17 09:19:16.218016935 +0000 UTC m=+833.761272581 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert") pod "openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" (UID: "f658d1a9-916e-41c9-8268-e94c22c6a045") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.478518 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4"] Feb 17 09:19:00 crc kubenswrapper[4848]: W0217 09:19:00.481130 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2de98b6_28a9_446d_bc9b_ac7aad58be7d.slice/crio-1133129db51962b0392e8559f4bd5af7ce363f34955c2649a55633a257c352b3 WatchSource:0}: Error finding container 1133129db51962b0392e8559f4bd5af7ce363f34955c2649a55633a257c352b3: Status 404 returned error can't find the container with id 1133129db51962b0392e8559f4bd5af7ce363f34955c2649a55633a257c352b3 Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.579144 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q" event={"ID":"f151e0ea-ac05-426d-aa94-e32cc25fdc09","Type":"ContainerStarted","Data":"dc868b2eb7147f13a8c78c22a9e4d0f9cf97be195f6fc9f23341c0f9b58349da"} Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.579509 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q" Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.580789 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" event={"ID":"a2de98b6-28a9-446d-bc9b-ac7aad58be7d","Type":"ContainerStarted","Data":"1133129db51962b0392e8559f4bd5af7ce363f34955c2649a55633a257c352b3"} Feb 17 09:19:00 crc kubenswrapper[4848]: 
E0217 09:19:00.583885 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql" podUID="88153939-7ca7-448d-a21c-b8330360b5a1" Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.600018 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q" podStartSLOduration=2.9670747840000002 podStartE2EDuration="16.599993516s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.020019725 +0000 UTC m=+803.563275371" lastFinishedPulling="2026-02-17 09:18:59.652938457 +0000 UTC m=+817.196194103" observedRunningTime="2026-02-17 09:19:00.59436719 +0000 UTC m=+818.137622836" watchObservedRunningTime="2026-02-17 09:19:00.599993516 +0000 UTC m=+818.143249182" Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.726575 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:19:00 crc kubenswrapper[4848]: I0217 09:19:00.726623 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" Feb 17 09:19:00 
crc kubenswrapper[4848]: E0217 09:19:00.727093 4848 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 17 09:19:00 crc kubenswrapper[4848]: E0217 09:19:00.727183 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. No retries permitted until 2026-02-17 09:19:16.727168651 +0000 UTC m=+834.270424297 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "webhook-server-cert" not found Feb 17 09:19:00 crc kubenswrapper[4848]: E0217 09:19:00.727503 4848 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 17 09:19:00 crc kubenswrapper[4848]: E0217 09:19:00.727588 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs podName:32b36aa1-7151-443d-9091-bc1e8ea86805 nodeName:}" failed. No retries permitted until 2026-02-17 09:19:16.727570983 +0000 UTC m=+834.270826629 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs") pod "openstack-operator-controller-manager-74d597bfd6-4ptpp" (UID: "32b36aa1-7151-443d-9091-bc1e8ea86805") : secret "metrics-server-cert" not found Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.617310 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd" event={"ID":"aa45cd11-5d86-47c3-b46e-15c0b204feb6","Type":"ContainerStarted","Data":"e3b48173447717c9a70b4f48d23863c1766eb150398b3dba6e65abf204e3dd11"} Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.625746 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987464f4-958lw" event={"ID":"17a4dcbd-4735-48d6-a575-f7d3af6843f1","Type":"ContainerStarted","Data":"1afac81026c91288d3405ed75b2c63d3b07cf8ab8e0e95ab997025e77ac92a40"} Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.627487 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds" event={"ID":"e6251943-952f-4cbc-924c-b362d9f7c8da","Type":"ContainerStarted","Data":"188efc75bb808eaefe8016717aa5b632c10a4a6d5a942b6710097182a9eed291"} Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.627598 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds" Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.632464 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm" event={"ID":"05876a75-9b3e-45b7-a3fe-89ab569742fd","Type":"ContainerStarted","Data":"ca67bec0b306d9ffb2623cd6e91a4f31b726610a93012029871ed39d3b279b7b"} Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.632546 4848 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm" Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.644469 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mltd5" event={"ID":"6b8d9b10-d577-4621-88d4-6f26e692a502","Type":"ContainerStarted","Data":"a561251b2cb2c1905057a7df56df97b6d83d97d7e9aa3ad03aa618a6eacc94bc"} Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.648304 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf" event={"ID":"04f9fe37-de58-4b62-896e-0945a7bcbfdf","Type":"ContainerStarted","Data":"682ade95ffebab711beb022e05048e5a95981c136078459ac0d1bde615d9a16f"} Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.649206 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g" event={"ID":"32c32c38-9ebf-4e9a-bea8-e761159dda5f","Type":"ContainerStarted","Data":"84fbc0ed482d385c17c43074fd11b06b49d7945dae01c31101471185bdbdc0ba"} Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.654377 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr" event={"ID":"ce2d3288-2b7d-4db8-861d-0a413fc90222","Type":"ContainerStarted","Data":"80c570caa63d25543216a3b2e9c2deb74d98fee7860dae3bdb3a8c71233f1dab"} Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.660488 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z" event={"ID":"cf85b89f-2556-4ee7-a12b-6a4379f962e9","Type":"ContainerStarted","Data":"2b7174a7a8741b7dae4f02a3d358c803c026b012c34041de873e994d195b51d0"} Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.661941 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc" event={"ID":"aebf2b53-aeb7-44a1-a2c2-3dfaa0e01bdd","Type":"ContainerStarted","Data":"eaa29a7610b8e6e6f9b5acb53816dd3108954e6ec3e7d889878976bccc183c02"} Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.701376 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds" podStartSLOduration=6.128832497 podStartE2EDuration="19.701350654s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.020482249 +0000 UTC m=+803.563737895" lastFinishedPulling="2026-02-17 09:18:59.593000406 +0000 UTC m=+817.136256052" observedRunningTime="2026-02-17 09:19:03.673028192 +0000 UTC m=+821.216283838" watchObservedRunningTime="2026-02-17 09:19:03.701350654 +0000 UTC m=+821.244606300" Feb 17 09:19:03 crc kubenswrapper[4848]: I0217 09:19:03.704896 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm" podStartSLOduration=6.592251156 podStartE2EDuration="20.704877948s" podCreationTimestamp="2026-02-17 09:18:43 +0000 UTC" firstStartedPulling="2026-02-17 09:18:45.489952985 +0000 UTC m=+803.033208631" lastFinishedPulling="2026-02-17 09:18:59.602579767 +0000 UTC m=+817.145835423" observedRunningTime="2026-02-17 09:19:03.702056605 +0000 UTC m=+821.245312261" watchObservedRunningTime="2026-02-17 09:19:03.704877948 +0000 UTC m=+821.248133614" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.671179 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.673240 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mltd5" Feb 17 09:19:04 crc 
kubenswrapper[4848]: I0217 09:19:04.673981 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.674990 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.675488 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.675991 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987464f4-958lw" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.676941 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.676984 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.705855 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf" podStartSLOduration=7.059871795 podStartE2EDuration="20.70583631s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.018785209 +0000 UTC m=+803.562040855" lastFinishedPulling="2026-02-17 09:18:59.664749724 +0000 UTC m=+817.208005370" observedRunningTime="2026-02-17 09:19:04.697690591 +0000 UTC m=+822.240946267" watchObservedRunningTime="2026-02-17 09:19:04.70583631 +0000 UTC m=+822.249091966" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.721387 4848 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987464f4-958lw" podStartSLOduration=7.466814285 podStartE2EDuration="21.721368096s" podCreationTimestamp="2026-02-17 09:18:43 +0000 UTC" firstStartedPulling="2026-02-17 09:18:45.359324188 +0000 UTC m=+802.902579824" lastFinishedPulling="2026-02-17 09:18:59.613877989 +0000 UTC m=+817.157133635" observedRunningTime="2026-02-17 09:19:04.712353132 +0000 UTC m=+822.255608818" watchObservedRunningTime="2026-02-17 09:19:04.721368096 +0000 UTC m=+822.264623752" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.745452 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc" podStartSLOduration=7.093558694 podStartE2EDuration="20.745428453s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.009846826 +0000 UTC m=+803.553102472" lastFinishedPulling="2026-02-17 09:18:59.661716585 +0000 UTC m=+817.204972231" observedRunningTime="2026-02-17 09:19:04.728389713 +0000 UTC m=+822.271645359" watchObservedRunningTime="2026-02-17 09:19:04.745428453 +0000 UTC m=+822.288684099" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.751522 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd" podStartSLOduration=7.130600804 podStartE2EDuration="20.751502842s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.031946516 +0000 UTC m=+803.575202162" lastFinishedPulling="2026-02-17 09:18:59.652848554 +0000 UTC m=+817.196104200" observedRunningTime="2026-02-17 09:19:04.74497509 +0000 UTC m=+822.288230736" watchObservedRunningTime="2026-02-17 09:19:04.751502842 +0000 UTC m=+822.294758508" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.771711 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g" podStartSLOduration=6.512831138 podStartE2EDuration="20.771688885s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:45.409565194 +0000 UTC m=+802.952820840" lastFinishedPulling="2026-02-17 09:18:59.668422951 +0000 UTC m=+817.211678587" observedRunningTime="2026-02-17 09:19:04.757876869 +0000 UTC m=+822.301132515" watchObservedRunningTime="2026-02-17 09:19:04.771688885 +0000 UTC m=+822.314944531" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.777894 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z" podStartSLOduration=7.04073846 podStartE2EDuration="21.777880307s" podCreationTimestamp="2026-02-17 09:18:43 +0000 UTC" firstStartedPulling="2026-02-17 09:18:44.857208149 +0000 UTC m=+802.400463795" lastFinishedPulling="2026-02-17 09:18:59.594349996 +0000 UTC m=+817.137605642" observedRunningTime="2026-02-17 09:19:04.77357827 +0000 UTC m=+822.316833916" watchObservedRunningTime="2026-02-17 09:19:04.777880307 +0000 UTC m=+822.321135953" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.799022 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr" podStartSLOduration=6.682301355 podStartE2EDuration="20.799004077s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:45.476305894 +0000 UTC m=+803.019561540" lastFinishedPulling="2026-02-17 09:18:59.593008616 +0000 UTC m=+817.136264262" observedRunningTime="2026-02-17 09:19:04.796946916 +0000 UTC m=+822.340202572" watchObservedRunningTime="2026-02-17 09:19:04.799004077 +0000 UTC m=+822.342259723" Feb 17 09:19:04 crc kubenswrapper[4848]: I0217 09:19:04.824057 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/swift-operator-controller-manager-68f46476f-mltd5" podStartSLOduration=7.202939618 podStartE2EDuration="20.824033652s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.03175286 +0000 UTC m=+803.575008506" lastFinishedPulling="2026-02-17 09:18:59.652846894 +0000 UTC m=+817.196102540" observedRunningTime="2026-02-17 09:19:04.816204552 +0000 UTC m=+822.359460198" watchObservedRunningTime="2026-02-17 09:19:04.824033652 +0000 UTC m=+822.367289298"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.713022 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" event={"ID":"a2de98b6-28a9-446d-bc9b-ac7aad58be7d","Type":"ContainerStarted","Data":"49fe73e9d1c43c840ebba66af9414e7bcc25c424d9e6163438be1c2d165eb55c"}
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.713662 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.714891 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" event={"ID":"a150a634-4cfd-4d77-ada7-5ab1f65a8985","Type":"ContainerStarted","Data":"3f6e74303fbfa154565f1b3e352c500d048ccb12a893b962a275a43e1876c0ce"}
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.716852 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" event={"ID":"653da755-b43b-4da9-bfd9-e8ee0bb44cc4","Type":"ContainerStarted","Data":"61c7606820dc6e5508077761f11cd3cfb58f5b5f4932dc1b59bb7b0a735341e8"}
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.717095 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.718646 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" event={"ID":"e72f9717-510f-4f9e-8557-ccd69b4dc61c","Type":"ContainerStarted","Data":"2a2d4b4eae1eedf0471c607c1ae0d95594c109b501de4bc6ac0ceaba31ee7575"}
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.718826 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.720587 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" event={"ID":"632a67ba-e5ae-43bb-a69e-49cf64c054e4","Type":"ContainerStarted","Data":"4832d2d1099a4581cffab46cc51d3601d7d615d811c6861c24c6ddb2d430be3f"}
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.720810 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.722406 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" event={"ID":"3922bb1d-9f36-4ffc-b382-a54c1c213008","Type":"ContainerStarted","Data":"492deb24f205095e7534bab2564c3f3f4f72f7d903cba21080d0243025424490"}
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.722658 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.725043 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" event={"ID":"bced4dcb-9bee-42a5-9e52-e6ddc83f8f06","Type":"ContainerStarted","Data":"78eadb0a46cb4485741fc1db3a0af0d682d24ab0ae44d705b904d51db5e98560"}
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.725320 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.738349 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4" podStartSLOduration=17.898972832 podStartE2EDuration="26.738331388s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:19:00.482726421 +0000 UTC m=+818.025982067" lastFinishedPulling="2026-02-17 09:19:09.322084937 +0000 UTC m=+826.865340623" observedRunningTime="2026-02-17 09:19:10.735995149 +0000 UTC m=+828.279250825" watchObservedRunningTime="2026-02-17 09:19:10.738331388 +0000 UTC m=+828.281587024"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.759557 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs" podStartSLOduration=3.432892957 podStartE2EDuration="26.75952618s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.041084604 +0000 UTC m=+803.584340250" lastFinishedPulling="2026-02-17 09:19:09.367717807 +0000 UTC m=+826.910973473" observedRunningTime="2026-02-17 09:19:10.755267875 +0000 UTC m=+828.298523561" watchObservedRunningTime="2026-02-17 09:19:10.75952618 +0000 UTC m=+828.302781836"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.772307 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x" podStartSLOduration=3.511154205 podStartE2EDuration="26.772292275s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.060571236 +0000 UTC m=+803.603826882" lastFinishedPulling="2026-02-17 09:19:09.321709276 +0000 UTC m=+826.864964952" observedRunningTime="2026-02-17 09:19:10.770986097 +0000 UTC m=+828.314241763" watchObservedRunningTime="2026-02-17 09:19:10.772292275 +0000 UTC m=+828.315547921"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.793072 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8" podStartSLOduration=3.523928571 podStartE2EDuration="26.793056245s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.06784887 +0000 UTC m=+803.611104516" lastFinishedPulling="2026-02-17 09:19:09.336976544 +0000 UTC m=+826.880232190" observedRunningTime="2026-02-17 09:19:10.788162221 +0000 UTC m=+828.331417907" watchObservedRunningTime="2026-02-17 09:19:10.793056245 +0000 UTC m=+828.336311891"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.829162 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" podStartSLOduration=5.231033595 podStartE2EDuration="26.829139885s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.047870733 +0000 UTC m=+803.591126379" lastFinishedPulling="2026-02-17 09:19:07.645977023 +0000 UTC m=+825.189232669" observedRunningTime="2026-02-17 09:19:10.815929977 +0000 UTC m=+828.359185633" watchObservedRunningTime="2026-02-17 09:19:10.829139885 +0000 UTC m=+828.372395541"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.839589 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhb2h" podStartSLOduration=3.467583997 podStartE2EDuration="26.839576272s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.066887302 +0000 UTC m=+803.610142938" lastFinishedPulling="2026-02-17 09:19:09.438879557 +0000 UTC m=+826.982135213" observedRunningTime="2026-02-17 09:19:10.833166623 +0000 UTC m=+828.376422269" watchObservedRunningTime="2026-02-17 09:19:10.839576272 +0000 UTC m=+828.382831918"
Feb 17 09:19:10 crc kubenswrapper[4848]: I0217 09:19:10.873058 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq" podStartSLOduration=3.550280705 podStartE2EDuration="26.873039335s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:46.04572423 +0000 UTC m=+803.588979876" lastFinishedPulling="2026-02-17 09:19:09.36848282 +0000 UTC m=+826.911738506" observedRunningTime="2026-02-17 09:19:10.870637094 +0000 UTC m=+828.413892740" watchObservedRunningTime="2026-02-17 09:19:10.873039335 +0000 UTC m=+828.416294981"
Feb 17 09:19:12 crc kubenswrapper[4848]: I0217 09:19:12.743374 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6" event={"ID":"b2e407ed-c962-4fcf-b367-f4164d644de6","Type":"ContainerStarted","Data":"09848c433565ea0bce21ea69d8f25d541058f33d5603ff9cc1b3349d76dacee5"}
Feb 17 09:19:12 crc kubenswrapper[4848]: I0217 09:19:12.744629 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6"
Feb 17 09:19:12 crc kubenswrapper[4848]: I0217 09:19:12.745957 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql" event={"ID":"88153939-7ca7-448d-a21c-b8330360b5a1","Type":"ContainerStarted","Data":"56adf780667f4495c6c003c44c7e093a06ef2b99664e1a9183f9c6225d5addfc"}
Feb 17 09:19:12 crc kubenswrapper[4848]: I0217 09:19:12.746352 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql"
Feb 17 09:19:12 crc kubenswrapper[4848]: I0217 09:19:12.769530 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6" podStartSLOduration=3.103579299 podStartE2EDuration="29.76951337s" podCreationTimestamp="2026-02-17 09:18:43 +0000 UTC" firstStartedPulling="2026-02-17 09:18:45.42986004 +0000 UTC m=+802.973115676" lastFinishedPulling="2026-02-17 09:19:12.095794101 +0000 UTC m=+829.639049747" observedRunningTime="2026-02-17 09:19:12.767181712 +0000 UTC m=+830.310437388" watchObservedRunningTime="2026-02-17 09:19:12.76951337 +0000 UTC m=+830.312769026"
Feb 17 09:19:12 crc kubenswrapper[4848]: I0217 09:19:12.796410 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql" podStartSLOduration=2.931159348 podStartE2EDuration="28.796383939s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:45.969476511 +0000 UTC m=+803.512732157" lastFinishedPulling="2026-02-17 09:19:11.834701102 +0000 UTC m=+829.377956748" observedRunningTime="2026-02-17 09:19:12.795971287 +0000 UTC m=+830.339226943" watchObservedRunningTime="2026-02-17 09:19:12.796383939 +0000 UTC m=+830.339639595"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.330209 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-vqb7z"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.475830 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-fbwmm"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.494292 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987464f4-958lw"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.515222 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-86z5g"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.551412 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-5lpnr"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.614257 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-sw5bf"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.647280 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-54f6768c69-qd7ds"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.739159 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64ddbf8bb-fkvjd"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.797825 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-lth6q"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.808614 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-69f8888797-mcnbc"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.820506 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-ckxfq"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.877366 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-d44cf6b75-ckggs"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.904717 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-mltd5"
Feb 17 09:19:14 crc kubenswrapper[4848]: I0217 09:19:14.927046 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-7f45b4ff68-jcd2x"
Feb 17 09:19:15 crc kubenswrapper[4848]: I0217 09:19:15.027284 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7866795846-4phd8"
Feb 17 09:19:15 crc kubenswrapper[4848]: I0217 09:19:15.215990 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp"
Feb 17 09:19:15 crc kubenswrapper[4848]: I0217 09:19:15.781040 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8" event={"ID":"97430748-300a-434e-a6b3-52274422ab66","Type":"ContainerStarted","Data":"98019d2591d188d35515099b10a32d52f0048d6b14303bcee87b12f58623a5d9"}
Feb 17 09:19:15 crc kubenswrapper[4848]: I0217 09:19:15.781375 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8"
Feb 17 09:19:15 crc kubenswrapper[4848]: I0217 09:19:15.811560 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8" podStartSLOduration=2.975673296 podStartE2EDuration="31.811531457s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:18:45.952637236 +0000 UTC m=+803.495892882" lastFinishedPulling="2026-02-17 09:19:14.788495397 +0000 UTC m=+832.331751043" observedRunningTime="2026-02-17 09:19:15.803360427 +0000 UTC m=+833.346616143" watchObservedRunningTime="2026-02-17 09:19:15.811531457 +0000 UTC m=+833.354787143"
Feb 17 09:19:16 crc kubenswrapper[4848]: I0217 09:19:16.275833 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr"
Feb 17 09:19:16 crc kubenswrapper[4848]: I0217 09:19:16.286554 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f658d1a9-916e-41c9-8268-e94c22c6a045-cert\") pod \"openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr\" (UID: \"f658d1a9-916e-41c9-8268-e94c22c6a045\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr"
Feb 17 09:19:16 crc kubenswrapper[4848]: I0217 09:19:16.359071 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr"
Feb 17 09:19:16 crc kubenswrapper[4848]: I0217 09:19:16.783218 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp"
Feb 17 09:19:16 crc kubenswrapper[4848]: I0217 09:19:16.783329 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp"
Feb 17 09:19:16 crc kubenswrapper[4848]: I0217 09:19:16.791443 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-metrics-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp"
Feb 17 09:19:16 crc kubenswrapper[4848]: I0217 09:19:16.792100 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/32b36aa1-7151-443d-9091-bc1e8ea86805-webhook-certs\") pod \"openstack-operator-controller-manager-74d597bfd6-4ptpp\" (UID: \"32b36aa1-7151-443d-9091-bc1e8ea86805\") " pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp"
Feb 17 09:19:16 crc kubenswrapper[4848]: I0217 09:19:16.878474 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr"]
Feb 17 09:19:16 crc kubenswrapper[4848]: W0217 09:19:16.893502 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf658d1a9_916e_41c9_8268_e94c22c6a045.slice/crio-7b239e66e7ac5a023d8e97c6b6abc8c5b18eb4993a16e57f01852f9bb0beeb38 WatchSource:0}: Error finding container 7b239e66e7ac5a023d8e97c6b6abc8c5b18eb4993a16e57f01852f9bb0beeb38: Status 404 returned error can't find the container with id 7b239e66e7ac5a023d8e97c6b6abc8c5b18eb4993a16e57f01852f9bb0beeb38
Feb 17 09:19:17 crc kubenswrapper[4848]: I0217 09:19:17.022447 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp"
Feb 17 09:19:17 crc kubenswrapper[4848]: W0217 09:19:17.325347 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32b36aa1_7151_443d_9091_bc1e8ea86805.slice/crio-801b14e8143b73c4d0d9f6400c5a42790b1065871ac68e160d69a483a959e031 WatchSource:0}: Error finding container 801b14e8143b73c4d0d9f6400c5a42790b1065871ac68e160d69a483a959e031: Status 404 returned error can't find the container with id 801b14e8143b73c4d0d9f6400c5a42790b1065871ac68e160d69a483a959e031
Feb 17 09:19:17 crc kubenswrapper[4848]: I0217 09:19:17.326410 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp"]
Feb 17 09:19:17 crc kubenswrapper[4848]: I0217 09:19:17.807309 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" event={"ID":"32b36aa1-7151-443d-9091-bc1e8ea86805","Type":"ContainerStarted","Data":"cc69e3aeda0c1ca431b0c811e72e807acb108c81e2b0ef7f181818753ec9a60f"}
Feb 17 09:19:17 crc kubenswrapper[4848]: I0217 09:19:17.807581 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" event={"ID":"32b36aa1-7151-443d-9091-bc1e8ea86805","Type":"ContainerStarted","Data":"801b14e8143b73c4d0d9f6400c5a42790b1065871ac68e160d69a483a959e031"}
Feb 17 09:19:17 crc kubenswrapper[4848]: I0217 09:19:17.809045 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp"
Feb 17 09:19:17 crc kubenswrapper[4848]: I0217 09:19:17.810751 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" event={"ID":"f658d1a9-916e-41c9-8268-e94c22c6a045","Type":"ContainerStarted","Data":"7b239e66e7ac5a023d8e97c6b6abc8c5b18eb4993a16e57f01852f9bb0beeb38"}
Feb 17 09:19:17 crc kubenswrapper[4848]: I0217 09:19:17.836276 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp" podStartSLOduration=33.836259461 podStartE2EDuration="33.836259461s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:19:17.829554974 +0000 UTC m=+835.372810610" watchObservedRunningTime="2026-02-17 09:19:17.836259461 +0000 UTC m=+835.379515107"
Feb 17 09:19:18 crc kubenswrapper[4848]: I0217 09:19:18.771966 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 17 09:19:18 crc kubenswrapper[4848]: I0217 09:19:18.772016 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 17 09:19:18 crc kubenswrapper[4848]: I0217 09:19:18.772055 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz"
Feb 17 09:19:18 crc kubenswrapper[4848]: I0217 09:19:18.772544 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cf7c734d597165a992ca275dfa403ad67456d929b1c93b35482f6a777604c954"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 17 09:19:18 crc kubenswrapper[4848]: I0217 09:19:18.772598 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://cf7c734d597165a992ca275dfa403ad67456d929b1c93b35482f6a777604c954" gracePeriod=600
Feb 17 09:19:19 crc kubenswrapper[4848]: I0217 09:19:19.828162 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" event={"ID":"f658d1a9-916e-41c9-8268-e94c22c6a045","Type":"ContainerStarted","Data":"4231fdf479228a4952730512c440c813596c9642609e4707676965a2fa035186"}
Feb 17 09:19:19 crc kubenswrapper[4848]: I0217 09:19:19.828709 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr"
Feb 17 09:19:19 crc kubenswrapper[4848]: I0217 09:19:19.831550 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="cf7c734d597165a992ca275dfa403ad67456d929b1c93b35482f6a777604c954" exitCode=0
Feb 17 09:19:19 crc kubenswrapper[4848]: I0217 09:19:19.831629 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"cf7c734d597165a992ca275dfa403ad67456d929b1c93b35482f6a777604c954"}
Feb 17 09:19:19 crc kubenswrapper[4848]: I0217 09:19:19.831687 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"394ea7e530ba2c06a8e2c57a8f43255e3afc07d0c7f59b99a48b84ecd7fdc2a0"}
Feb 17 09:19:19 crc kubenswrapper[4848]: I0217 09:19:19.831710 4848 scope.go:117] "RemoveContainer" containerID="22658753e35aaed607fe5a9bec19a88c03d7cd46abf15a9cf79a3c5b734a8c5c"
Feb 17 09:19:19 crc kubenswrapper[4848]: I0217 09:19:19.859781 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr" podStartSLOduration=33.965421934 podStartE2EDuration="35.859731848s" podCreationTimestamp="2026-02-17 09:18:44 +0000 UTC" firstStartedPulling="2026-02-17 09:19:16.896031783 +0000 UTC m=+834.439287429" lastFinishedPulling="2026-02-17 09:19:18.790341697 +0000 UTC m=+836.333597343" observedRunningTime="2026-02-17 09:19:19.856268786 +0000 UTC m=+837.399524493" watchObservedRunningTime="2026-02-17 09:19:19.859731848 +0000 UTC m=+837.402987534"
Feb 17 09:19:20 crc kubenswrapper[4848]: I0217 09:19:20.127907 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dlmq4"
Feb 17 09:19:24 crc kubenswrapper[4848]: I0217 09:19:24.463196 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-5d946d989d-wlll6"
Feb 17 09:19:24 crc kubenswrapper[4848]: I0217 09:19:24.622166 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-9wvc8"
Feb 17 09:19:24 crc kubenswrapper[4848]: I0217 09:19:24.800588 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-rvzql"
Feb 17 09:19:26 crc kubenswrapper[4848]: I0217 09:19:26.365309 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr"
Feb 17 09:19:27 crc kubenswrapper[4848]: I0217 09:19:27.033194 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-74d597bfd6-4ptpp"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.039627 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-v8qhl"]
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.041640 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.045623 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.045793 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.045863 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.046022 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zpvrn"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.055214 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-v8qhl"]
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.095517 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-mx8rm"]
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.104308 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.107070 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.107070 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-mx8rm"]
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.220123 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn62j\" (UniqueName: \"kubernetes.io/projected/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-kube-api-access-qn62j\") pod \"dnsmasq-dns-855cbc58c5-v8qhl\" (UID: \"6446f1e4-f3de-4729-a5b9-f1e9974bfc67\") " pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.220273 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-config\") pod \"dnsmasq-dns-6fcf94d689-mx8rm\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.220490 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-mx8rm\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.221013 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-config\") pod \"dnsmasq-dns-855cbc58c5-v8qhl\" (UID: \"6446f1e4-f3de-4729-a5b9-f1e9974bfc67\") " pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.221292 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhpjr\" (UniqueName: \"kubernetes.io/projected/04fea754-2f62-4c33-943e-e602f4875d21-kube-api-access-hhpjr\") pod \"dnsmasq-dns-6fcf94d689-mx8rm\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.322302 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-config\") pod \"dnsmasq-dns-855cbc58c5-v8qhl\" (UID: \"6446f1e4-f3de-4729-a5b9-f1e9974bfc67\") " pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.322374 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhpjr\" (UniqueName: \"kubernetes.io/projected/04fea754-2f62-4c33-943e-e602f4875d21-kube-api-access-hhpjr\") pod \"dnsmasq-dns-6fcf94d689-mx8rm\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.322421 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn62j\" (UniqueName: \"kubernetes.io/projected/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-kube-api-access-qn62j\") pod \"dnsmasq-dns-855cbc58c5-v8qhl\" (UID: \"6446f1e4-f3de-4729-a5b9-f1e9974bfc67\") " pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.322452 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-config\") pod \"dnsmasq-dns-6fcf94d689-mx8rm\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.322479 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-mx8rm\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.323537 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-config\") pod \"dnsmasq-dns-855cbc58c5-v8qhl\" (UID: \"6446f1e4-f3de-4729-a5b9-f1e9974bfc67\") " pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.324117 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-config\") pod \"dnsmasq-dns-6fcf94d689-mx8rm\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.330360 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-dns-svc\") pod \"dnsmasq-dns-6fcf94d689-mx8rm\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.344251 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhpjr\" (UniqueName: \"kubernetes.io/projected/04fea754-2f62-4c33-943e-e602f4875d21-kube-api-access-hhpjr\") pod \"dnsmasq-dns-6fcf94d689-mx8rm\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.348828 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn62j\" (UniqueName: \"kubernetes.io/projected/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-kube-api-access-qn62j\") pod \"dnsmasq-dns-855cbc58c5-v8qhl\" (UID: \"6446f1e4-f3de-4729-a5b9-f1e9974bfc67\") " pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.361817 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.428662 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm"
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.790641 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-v8qhl"]
Feb 17 09:19:45 crc kubenswrapper[4848]: I0217 09:19:45.900586 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-mx8rm"]
Feb 17 09:19:45 crc kubenswrapper[4848]: W0217 09:19:45.910356 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04fea754_2f62_4c33_943e_e602f4875d21.slice/crio-37cefdbf0333384eaa9582e2c12d42c9c4e30a21b5d00f852ae603490b8b25f5 WatchSource:0}: Error finding container 37cefdbf0333384eaa9582e2c12d42c9c4e30a21b5d00f852ae603490b8b25f5: Status 404 returned error can't find the container with id 37cefdbf0333384eaa9582e2c12d42c9c4e30a21b5d00f852ae603490b8b25f5
Feb 17 09:19:46 crc kubenswrapper[4848]: I0217 09:19:46.039646 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm" event={"ID":"04fea754-2f62-4c33-943e-e602f4875d21","Type":"ContainerStarted","Data":"37cefdbf0333384eaa9582e2c12d42c9c4e30a21b5d00f852ae603490b8b25f5"}
Feb 17 09:19:46 crc kubenswrapper[4848]: I0217 09:19:46.041050 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl" event={"ID":"6446f1e4-f3de-4729-a5b9-f1e9974bfc67","Type":"ContainerStarted","Data":"79ff1d96e8fdcb4aaa7a8f65f2be6f3de4a9bda9acfb46fa2478a5e6f37241e9"}
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.730263 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-mx8rm"]
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.749342 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-d4ktk"]
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.750488 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk"
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.767989 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-d4ktk"]
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.859624 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgx22\" (UniqueName: \"kubernetes.io/projected/81f878e8-a78a-43d8-8f54-e10d5393335d-kube-api-access-fgx22\") pod \"dnsmasq-dns-f54874ffc-d4ktk\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " pod="openstack/dnsmasq-dns-f54874ffc-d4ktk"
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.859693 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-config\") pod \"dnsmasq-dns-f54874ffc-d4ktk\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " pod="openstack/dnsmasq-dns-f54874ffc-d4ktk"
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.859775 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-dns-svc\") pod \"dnsmasq-dns-f54874ffc-d4ktk\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " pod="openstack/dnsmasq-dns-f54874ffc-d4ktk"
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.961458 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgx22\" (UniqueName: \"kubernetes.io/projected/81f878e8-a78a-43d8-8f54-e10d5393335d-kube-api-access-fgx22\") pod \"dnsmasq-dns-f54874ffc-d4ktk\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " pod="openstack/dnsmasq-dns-f54874ffc-d4ktk"
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.961514 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-config\") pod \"dnsmasq-dns-f54874ffc-d4ktk\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " pod="openstack/dnsmasq-dns-f54874ffc-d4ktk"
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.961582 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-dns-svc\") pod \"dnsmasq-dns-f54874ffc-d4ktk\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " pod="openstack/dnsmasq-dns-f54874ffc-d4ktk"
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.962454 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-dns-svc\") pod \"dnsmasq-dns-f54874ffc-d4ktk\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " pod="openstack/dnsmasq-dns-f54874ffc-d4ktk"
Feb 17 09:19:47 crc kubenswrapper[4848]: I0217 09:19:47.962643 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-config\") pod \"dnsmasq-dns-f54874ffc-d4ktk\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " pod="openstack/dnsmasq-dns-f54874ffc-d4ktk"
Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.003014 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgx22\" (UniqueName: \"kubernetes.io/projected/81f878e8-a78a-43d8-8f54-e10d5393335d-kube-api-access-fgx22\") pod \"dnsmasq-dns-f54874ffc-d4ktk\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.024220 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-v8qhl"] Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.060484 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7lgm2"] Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.062025 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.078026 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.079779 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7lgm2"] Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.164287 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6vxr\" (UniqueName: \"kubernetes.io/projected/84f8e59b-6699-4b32-8772-d9347fd21259-kube-api-access-m6vxr\") pod \"dnsmasq-dns-67ff45466c-7lgm2\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.164342 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-dns-svc\") pod \"dnsmasq-dns-67ff45466c-7lgm2\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" 
Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.164367 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-config\") pod \"dnsmasq-dns-67ff45466c-7lgm2\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.265508 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6vxr\" (UniqueName: \"kubernetes.io/projected/84f8e59b-6699-4b32-8772-d9347fd21259-kube-api-access-m6vxr\") pod \"dnsmasq-dns-67ff45466c-7lgm2\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.266027 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-dns-svc\") pod \"dnsmasq-dns-67ff45466c-7lgm2\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.266056 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-config\") pod \"dnsmasq-dns-67ff45466c-7lgm2\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.267111 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-config\") pod \"dnsmasq-dns-67ff45466c-7lgm2\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.346587 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6vxr\" (UniqueName: \"kubernetes.io/projected/84f8e59b-6699-4b32-8772-d9347fd21259-kube-api-access-m6vxr\") pod \"dnsmasq-dns-67ff45466c-7lgm2\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.372907 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-dns-svc\") pod \"dnsmasq-dns-67ff45466c-7lgm2\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.397118 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.537636 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-d4ktk"] Feb 17 09:19:48 crc kubenswrapper[4848]: W0217 09:19:48.545806 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81f878e8_a78a_43d8_8f54_e10d5393335d.slice/crio-4f69e345f8e20d3a9b0e97714253418b57d4a006cac6a97b01f82b85f5c69ce9 WatchSource:0}: Error finding container 4f69e345f8e20d3a9b0e97714253418b57d4a006cac6a97b01f82b85f5c69ce9: Status 404 returned error can't find the container with id 4f69e345f8e20d3a9b0e97714253418b57d4a006cac6a97b01f82b85f5c69ce9 Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.896052 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.897752 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.902960 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sbvg9" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.904540 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.904699 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.904820 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.904830 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.905007 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.905216 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.907946 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.948331 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7lgm2"] Feb 17 09:19:48 crc kubenswrapper[4848]: W0217 09:19:48.954980 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod84f8e59b_6699_4b32_8772_d9347fd21259.slice/crio-e131e7b00e99d1c78aee7159b862ef6c1f27b4abf0026fd0355a2d5a6ec3cb16 WatchSource:0}: Error finding container e131e7b00e99d1c78aee7159b862ef6c1f27b4abf0026fd0355a2d5a6ec3cb16: Status 404 returned error 
can't find the container with id e131e7b00e99d1c78aee7159b862ef6c1f27b4abf0026fd0355a2d5a6ec3cb16 Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.991430 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-config-data\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.991478 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.991498 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.991561 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.991604 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhcn9\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-kube-api-access-xhcn9\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " 
pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.991625 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.991650 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db50eaa9-ca0a-4a83-98d8-fce82f849d91-pod-info\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.991669 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.991692 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc kubenswrapper[4848]: I0217 09:19:48.991747 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db50eaa9-ca0a-4a83-98d8-fce82f849d91-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:48 crc 
kubenswrapper[4848]: I0217 09:19:48.991782 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-server-conf\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.069798 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" event={"ID":"84f8e59b-6699-4b32-8772-d9347fd21259","Type":"ContainerStarted","Data":"e131e7b00e99d1c78aee7159b862ef6c1f27b4abf0026fd0355a2d5a6ec3cb16"} Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.071204 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" event={"ID":"81f878e8-a78a-43d8-8f54-e10d5393335d","Type":"ContainerStarted","Data":"4f69e345f8e20d3a9b0e97714253418b57d4a006cac6a97b01f82b85f5c69ce9"} Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.092855 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db50eaa9-ca0a-4a83-98d8-fce82f849d91-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.092910 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-server-conf\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.092986 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.093008 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.093036 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.093064 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.093093 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhcn9\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-kube-api-access-xhcn9\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.093117 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 
09:19:49.093159 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db50eaa9-ca0a-4a83-98d8-fce82f849d91-pod-info\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.093179 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.093212 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.093657 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.094019 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.094873 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.095650 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-config-data\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.095693 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.097582 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-server-conf\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.102305 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db50eaa9-ca0a-4a83-98d8-fce82f849d91-pod-info\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.103031 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db50eaa9-ca0a-4a83-98d8-fce82f849d91-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 
09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.103704 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.118997 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhcn9\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-kube-api-access-xhcn9\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.122321 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.125648 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.192208 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.194291 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.200208 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.200799 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.200848 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.200862 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.200982 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.201107 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.201257 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vsrr6" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.201746 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.225101 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401312 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401360 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59skc\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-kube-api-access-59skc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401393 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401421 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401447 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401519 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401679 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd7e9b9b-99f0-4720-b997-3f00996972e5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401735 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd7e9b9b-99f0-4720-b997-3f00996972e5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401813 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401867 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") 
" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.401910 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.503689 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.503752 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.503824 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.503879 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc 
kubenswrapper[4848]: I0217 09:19:49.503905 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59skc\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-kube-api-access-59skc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.504027 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.504053 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.504074 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.504108 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.504169 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd7e9b9b-99f0-4720-b997-3f00996972e5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.504673 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.505327 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd7e9b9b-99f0-4720-b997-3f00996972e5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.505944 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.506365 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.506724 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.506812 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.506938 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.509954 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd7e9b9b-99f0-4720-b997-3f00996972e5-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.510198 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.514009 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd7e9b9b-99f0-4720-b997-3f00996972e5-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.514707 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.522849 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59skc\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-kube-api-access-59skc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.557168 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:49 crc kubenswrapper[4848]: I0217 09:19:49.820738 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.414068 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.415320 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.418703 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.418924 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.419829 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-g82lz" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.421614 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.421716 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.434392 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.522390 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac51f8f5-cf36-44ef-b849-9bd6265e5156-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.522698 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ac51f8f5-cf36-44ef-b849-9bd6265e5156-kolla-config\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.522744 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.522784 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ac51f8f5-cf36-44ef-b849-9bd6265e5156-config-data-default\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.522805 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xz4s\" (UniqueName: \"kubernetes.io/projected/ac51f8f5-cf36-44ef-b849-9bd6265e5156-kube-api-access-5xz4s\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.522838 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ac51f8f5-cf36-44ef-b849-9bd6265e5156-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.522914 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac51f8f5-cf36-44ef-b849-9bd6265e5156-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.522953 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ac51f8f5-cf36-44ef-b849-9bd6265e5156-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.624636 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac51f8f5-cf36-44ef-b849-9bd6265e5156-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.624713 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac51f8f5-cf36-44ef-b849-9bd6265e5156-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.624736 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac51f8f5-cf36-44ef-b849-9bd6265e5156-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.625878 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ac51f8f5-cf36-44ef-b849-9bd6265e5156-kolla-config\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.626021 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " 
pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.626109 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ac51f8f5-cf36-44ef-b849-9bd6265e5156-config-data-default\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.626132 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xz4s\" (UniqueName: \"kubernetes.io/projected/ac51f8f5-cf36-44ef-b849-9bd6265e5156-kube-api-access-5xz4s\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.626185 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ac51f8f5-cf36-44ef-b849-9bd6265e5156-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.626600 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.626660 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ac51f8f5-cf36-44ef-b849-9bd6265e5156-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 
09:19:50.627476 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ac51f8f5-cf36-44ef-b849-9bd6265e5156-kolla-config\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.630441 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ac51f8f5-cf36-44ef-b849-9bd6265e5156-config-data-default\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.635139 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac51f8f5-cf36-44ef-b849-9bd6265e5156-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.665286 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xz4s\" (UniqueName: \"kubernetes.io/projected/ac51f8f5-cf36-44ef-b849-9bd6265e5156-kube-api-access-5xz4s\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.626504 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac51f8f5-cf36-44ef-b849-9bd6265e5156-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.673323 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.675587 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac51f8f5-cf36-44ef-b849-9bd6265e5156-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ac51f8f5-cf36-44ef-b849-9bd6265e5156\") " pod="openstack/openstack-galera-0" Feb 17 09:19:50 crc kubenswrapper[4848]: I0217 09:19:50.752593 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.717158 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.718417 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.720579 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.720824 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-9d56h" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.721150 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.723001 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.724425 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.845006 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241bdede-0e36-4cfa-965b-89449d5f84f0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.845062 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/241bdede-0e36-4cfa-965b-89449d5f84f0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.845100 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.845132 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/241bdede-0e36-4cfa-965b-89449d5f84f0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.845157 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/241bdede-0e36-4cfa-965b-89449d5f84f0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.845216 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/241bdede-0e36-4cfa-965b-89449d5f84f0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.845234 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt8vz\" (UniqueName: \"kubernetes.io/projected/241bdede-0e36-4cfa-965b-89449d5f84f0-kube-api-access-mt8vz\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.845258 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/241bdede-0e36-4cfa-965b-89449d5f84f0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.946808 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.947138 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.947281 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/241bdede-0e36-4cfa-965b-89449d5f84f0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.947325 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/241bdede-0e36-4cfa-965b-89449d5f84f0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.947430 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/241bdede-0e36-4cfa-965b-89449d5f84f0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.947448 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mt8vz\" (UniqueName: \"kubernetes.io/projected/241bdede-0e36-4cfa-965b-89449d5f84f0-kube-api-access-mt8vz\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.947477 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/241bdede-0e36-4cfa-965b-89449d5f84f0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.947542 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241bdede-0e36-4cfa-965b-89449d5f84f0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.947596 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/241bdede-0e36-4cfa-965b-89449d5f84f0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.948210 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/241bdede-0e36-4cfa-965b-89449d5f84f0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.948727 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/241bdede-0e36-4cfa-965b-89449d5f84f0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.949036 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/241bdede-0e36-4cfa-965b-89449d5f84f0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.950189 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/241bdede-0e36-4cfa-965b-89449d5f84f0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.950870 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/241bdede-0e36-4cfa-965b-89449d5f84f0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.962404 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/241bdede-0e36-4cfa-965b-89449d5f84f0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.979105 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt8vz\" (UniqueName: \"kubernetes.io/projected/241bdede-0e36-4cfa-965b-89449d5f84f0-kube-api-access-mt8vz\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:51 crc kubenswrapper[4848]: I0217 09:19:51.982752 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"openstack-cell1-galera-0\" (UID: \"241bdede-0e36-4cfa-965b-89449d5f84f0\") " pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.036289 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.037403 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.039589 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.039797 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.042265 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9ml2j" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.044244 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.051769 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.150268 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c60672b9-d590-48a6-80c0-e3f74547b5c2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.150323 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60672b9-d590-48a6-80c0-e3f74547b5c2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.150418 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c60672b9-d590-48a6-80c0-e3f74547b5c2-config-data\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " 
pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.150451 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfp99\" (UniqueName: \"kubernetes.io/projected/c60672b9-d590-48a6-80c0-e3f74547b5c2-kube-api-access-lfp99\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.150479 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c60672b9-d590-48a6-80c0-e3f74547b5c2-kolla-config\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.251574 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c60672b9-d590-48a6-80c0-e3f74547b5c2-config-data\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.251655 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfp99\" (UniqueName: \"kubernetes.io/projected/c60672b9-d590-48a6-80c0-e3f74547b5c2-kube-api-access-lfp99\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.251694 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c60672b9-d590-48a6-80c0-e3f74547b5c2-kolla-config\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.251738 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c60672b9-d590-48a6-80c0-e3f74547b5c2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.251782 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60672b9-d590-48a6-80c0-e3f74547b5c2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.252437 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c60672b9-d590-48a6-80c0-e3f74547b5c2-config-data\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.252935 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/c60672b9-d590-48a6-80c0-e3f74547b5c2-kolla-config\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.255190 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/c60672b9-d590-48a6-80c0-e3f74547b5c2-memcached-tls-certs\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.260656 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60672b9-d590-48a6-80c0-e3f74547b5c2-combined-ca-bundle\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc 
kubenswrapper[4848]: I0217 09:19:52.268450 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfp99\" (UniqueName: \"kubernetes.io/projected/c60672b9-d590-48a6-80c0-e3f74547b5c2-kube-api-access-lfp99\") pod \"memcached-0\" (UID: \"c60672b9-d590-48a6-80c0-e3f74547b5c2\") " pod="openstack/memcached-0" Feb 17 09:19:52 crc kubenswrapper[4848]: I0217 09:19:52.357380 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 17 09:19:54 crc kubenswrapper[4848]: I0217 09:19:54.621810 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 09:19:54 crc kubenswrapper[4848]: I0217 09:19:54.625227 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 09:19:54 crc kubenswrapper[4848]: I0217 09:19:54.629110 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-pqtvz" Feb 17 09:19:54 crc kubenswrapper[4848]: I0217 09:19:54.636545 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 09:19:54 crc kubenswrapper[4848]: I0217 09:19:54.792633 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rx8mr\" (UniqueName: \"kubernetes.io/projected/7c3fada3-5297-4ddf-ada7-fe24f8a9b17f-kube-api-access-rx8mr\") pod \"kube-state-metrics-0\" (UID: \"7c3fada3-5297-4ddf-ada7-fe24f8a9b17f\") " pod="openstack/kube-state-metrics-0" Feb 17 09:19:54 crc kubenswrapper[4848]: I0217 09:19:54.894429 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx8mr\" (UniqueName: \"kubernetes.io/projected/7c3fada3-5297-4ddf-ada7-fe24f8a9b17f-kube-api-access-rx8mr\") pod \"kube-state-metrics-0\" (UID: \"7c3fada3-5297-4ddf-ada7-fe24f8a9b17f\") " pod="openstack/kube-state-metrics-0" Feb 17 09:19:54 crc 
kubenswrapper[4848]: I0217 09:19:54.915973 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx8mr\" (UniqueName: \"kubernetes.io/projected/7c3fada3-5297-4ddf-ada7-fe24f8a9b17f-kube-api-access-rx8mr\") pod \"kube-state-metrics-0\" (UID: \"7c3fada3-5297-4ddf-ada7-fe24f8a9b17f\") " pod="openstack/kube-state-metrics-0" Feb 17 09:19:54 crc kubenswrapper[4848]: I0217 09:19:54.955048 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.339359 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c695f"] Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.343043 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.346654 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.346988 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-hwzp5" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.347202 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.349231 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-jbwkv"] Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.351506 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.374730 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c695f"] Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.399560 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jbwkv"] Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.437811 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-var-lib\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.438105 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlxhx\" (UniqueName: \"kubernetes.io/projected/43e80552-f64e-4257-a460-f108ee513c12-kube-api-access-hlxhx\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.438251 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-var-log\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.438374 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e80552-f64e-4257-a460-f108ee513c12-combined-ca-bundle\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: 
I0217 09:19:57.438490 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/43e80552-f64e-4257-a460-f108ee513c12-var-run\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.438581 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-var-run\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.438674 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e09d3b82-ad17-461a-89eb-b8ee45d4edff-scripts\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.438779 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e80552-f64e-4257-a460-f108ee513c12-var-run-ovn\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.438875 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/43e80552-f64e-4257-a460-f108ee513c12-var-log-ovn\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.439005 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-etc-ovs\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.439098 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e80552-f64e-4257-a460-f108ee513c12-ovn-controller-tls-certs\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.439198 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vdvg\" (UniqueName: \"kubernetes.io/projected/e09d3b82-ad17-461a-89eb-b8ee45d4edff-kube-api-access-6vdvg\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.439358 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43e80552-f64e-4257-a460-f108ee513c12-scripts\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541084 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-etc-ovs\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541160 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e80552-f64e-4257-a460-f108ee513c12-ovn-controller-tls-certs\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541187 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vdvg\" (UniqueName: \"kubernetes.io/projected/e09d3b82-ad17-461a-89eb-b8ee45d4edff-kube-api-access-6vdvg\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541228 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43e80552-f64e-4257-a460-f108ee513c12-scripts\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541291 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-var-lib\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541313 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlxhx\" (UniqueName: \"kubernetes.io/projected/43e80552-f64e-4257-a460-f108ee513c12-kube-api-access-hlxhx\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541346 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-var-log\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541373 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e80552-f64e-4257-a460-f108ee513c12-combined-ca-bundle\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541401 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/43e80552-f64e-4257-a460-f108ee513c12-var-run\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541424 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-var-run\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541469 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e09d3b82-ad17-461a-89eb-b8ee45d4edff-scripts\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541497 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/43e80552-f64e-4257-a460-f108ee513c12-var-run-ovn\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " 
pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.541516 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/43e80552-f64e-4257-a460-f108ee513c12-var-log-ovn\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.542135 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-etc-ovs\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.542281 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/43e80552-f64e-4257-a460-f108ee513c12-var-log-ovn\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.542325 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-var-run\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.542389 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/43e80552-f64e-4257-a460-f108ee513c12-var-run\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.542403 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/43e80552-f64e-4257-a460-f108ee513c12-var-run-ovn\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.542571 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-var-lib\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.542679 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/e09d3b82-ad17-461a-89eb-b8ee45d4edff-var-log\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.545980 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/43e80552-f64e-4257-a460-f108ee513c12-scripts\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.547263 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/43e80552-f64e-4257-a460-f108ee513c12-ovn-controller-tls-certs\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.548390 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e09d3b82-ad17-461a-89eb-b8ee45d4edff-scripts\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " 
pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.549376 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43e80552-f64e-4257-a460-f108ee513c12-combined-ca-bundle\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.560358 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlxhx\" (UniqueName: \"kubernetes.io/projected/43e80552-f64e-4257-a460-f108ee513c12-kube-api-access-hlxhx\") pod \"ovn-controller-c695f\" (UID: \"43e80552-f64e-4257-a460-f108ee513c12\") " pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.564361 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vdvg\" (UniqueName: \"kubernetes.io/projected/e09d3b82-ad17-461a-89eb-b8ee45d4edff-kube-api-access-6vdvg\") pod \"ovn-controller-ovs-jbwkv\" (UID: \"e09d3b82-ad17-461a-89eb-b8ee45d4edff\") " pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.660031 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.661378 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.667539 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-plntx" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.668282 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.668443 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.668536 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.668602 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.679001 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c695f" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.688209 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.694314 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.745138 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.745176 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2t8k\" (UniqueName: \"kubernetes.io/projected/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-kube-api-access-z2t8k\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.745238 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.745307 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.745360 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " 
pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.745392 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.745449 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.745488 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-config\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.846740 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.846833 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.846866 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-config\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.846886 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.846906 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2t8k\" (UniqueName: \"kubernetes.io/projected/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-kube-api-access-z2t8k\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.846924 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.846963 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.846998 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.847745 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.848219 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.848572 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.848676 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-config\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.850890 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.864516 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.864628 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.867188 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2t8k\" (UniqueName: \"kubernetes.io/projected/9ccc9c6f-4e19-464f-9e06-7a3951c63c85-kube-api-access-z2t8k\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.871888 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-nb-0\" (UID: \"9ccc9c6f-4e19-464f-9e06-7a3951c63c85\") " pod="openstack/ovsdbserver-nb-0" Feb 17 09:19:57 crc kubenswrapper[4848]: I0217 09:19:57.991632 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.317875 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.319212 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.379413 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.379538 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.379574 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-z54lr" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.379619 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.404700 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.407315 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.407354 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.407373 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t64hz\" (UniqueName: \"kubernetes.io/projected/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-kube-api-access-t64hz\") pod \"ovsdbserver-sb-0\" (UID: 
\"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.407394 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.407416 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.407450 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.407473 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.407794 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-config\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc 
kubenswrapper[4848]: I0217 09:20:01.509489 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-config\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.509546 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.509574 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.509591 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t64hz\" (UniqueName: \"kubernetes.io/projected/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-kube-api-access-t64hz\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.509610 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.509637 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.509674 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.509699 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.509988 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.510509 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.511470 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-config\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 
17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.511955 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.518023 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.519215 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.523666 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.529459 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t64hz\" (UniqueName: \"kubernetes.io/projected/0c03c6cc-b85f-465f-b692-8f50eaca7cd6-kube-api-access-t64hz\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.537001 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"ovsdbserver-sb-0\" (UID: \"0c03c6cc-b85f-465f-b692-8f50eaca7cd6\") " pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: E0217 09:20:01.643677 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 17 09:20:01 crc kubenswrapper[4848]: E0217 09:20:01.644112 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qn62j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-855cbc58c5-v8qhl_openstack(6446f1e4-f3de-4729-a5b9-f1e9974bfc67): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 09:20:01 crc kubenswrapper[4848]: E0217 09:20:01.645390 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl" podUID="6446f1e4-f3de-4729-a5b9-f1e9974bfc67" Feb 17 09:20:01 crc kubenswrapper[4848]: I0217 09:20:01.691623 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:01 crc kubenswrapper[4848]: E0217 09:20:01.739669 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2" Feb 17 09:20:01 crc kubenswrapper[4848]: E0217 09:20:01.740000 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server@sha256:f391b842000dadaeb692eb6b5e845c2aa3125ef24679fbb4af2c8b98252de4b2,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hhpjr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-6fcf94d689-mx8rm_openstack(04fea754-2f62-4c33-943e-e602f4875d21): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 09:20:01 crc kubenswrapper[4848]: E0217 09:20:01.741203 4848 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm" podUID="04fea754-2f62-4c33-943e-e602f4875d21" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.187474 4848 generic.go:334] "Generic (PLEG): container finished" podID="84f8e59b-6699-4b32-8772-d9347fd21259" containerID="f56f15d61fe7db37060d2e564594fc7ed18b336bb021a7204f188d4b1fc14305" exitCode=0 Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.187554 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" event={"ID":"84f8e59b-6699-4b32-8772-d9347fd21259","Type":"ContainerDied","Data":"f56f15d61fe7db37060d2e564594fc7ed18b336bb021a7204f188d4b1fc14305"} Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.189466 4848 generic.go:334] "Generic (PLEG): container finished" podID="81f878e8-a78a-43d8-8f54-e10d5393335d" containerID="7f5d4549a728921c6361d27e4f4eff5ad2b0b228823a2baf286d084654b790b9" exitCode=0 Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.189879 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" event={"ID":"81f878e8-a78a-43d8-8f54-e10d5393335d","Type":"ContainerDied","Data":"7f5d4549a728921c6361d27e4f4eff5ad2b0b228823a2baf286d084654b790b9"} Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.330805 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.338078 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.343354 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.388507 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/openstack-galera-0"] Feb 17 09:20:02 crc kubenswrapper[4848]: W0217 09:20:02.400498 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac51f8f5_cf36_44ef_b849_9bd6265e5156.slice/crio-f3cfbbaa8e004bcae3a66e808e45046c65707da8ac02f613202c4645f2dd9ae2 WatchSource:0}: Error finding container f3cfbbaa8e004bcae3a66e808e45046c65707da8ac02f613202c4645f2dd9ae2: Status 404 returned error can't find the container with id f3cfbbaa8e004bcae3a66e808e45046c65707da8ac02f613202c4645f2dd9ae2 Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.430915 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.439031 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.540284 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.606681 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c695f"] Feb 17 09:20:02 crc kubenswrapper[4848]: W0217 09:20:02.618615 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43e80552_f64e_4257_a460_f108ee513c12.slice/crio-6c2890ae0556898c1574190897533badcf7480e3028dfe66ca288654722de5c3 WatchSource:0}: Error finding container 6c2890ae0556898c1574190897533badcf7480e3028dfe66ca288654722de5c3: Status 404 returned error can't find the container with id 6c2890ae0556898c1574190897533badcf7480e3028dfe66ca288654722de5c3 Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.621399 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 17 09:20:02 crc kubenswrapper[4848]: W0217 09:20:02.621815 4848 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c03c6cc_b85f_465f_b692_8f50eaca7cd6.slice/crio-c2ed54833c151526abba4496fa0b72c632b4e86165f5546bbcdb77b67e642933 WatchSource:0}: Error finding container c2ed54833c151526abba4496fa0b72c632b4e86165f5546bbcdb77b67e642933: Status 404 returned error can't find the container with id c2ed54833c151526abba4496fa0b72c632b4e86165f5546bbcdb77b67e642933 Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.679501 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.688850 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.751112 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-config\") pod \"04fea754-2f62-4c33-943e-e602f4875d21\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.751171 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn62j\" (UniqueName: \"kubernetes.io/projected/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-kube-api-access-qn62j\") pod \"6446f1e4-f3de-4729-a5b9-f1e9974bfc67\" (UID: \"6446f1e4-f3de-4729-a5b9-f1e9974bfc67\") " Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.751343 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhpjr\" (UniqueName: \"kubernetes.io/projected/04fea754-2f62-4c33-943e-e602f4875d21-kube-api-access-hhpjr\") pod \"04fea754-2f62-4c33-943e-e602f4875d21\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.751384 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-dns-svc\") pod \"04fea754-2f62-4c33-943e-e602f4875d21\" (UID: \"04fea754-2f62-4c33-943e-e602f4875d21\") " Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.751423 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-config\") pod \"6446f1e4-f3de-4729-a5b9-f1e9974bfc67\" (UID: \"6446f1e4-f3de-4729-a5b9-f1e9974bfc67\") " Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.751561 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-config" (OuterVolumeSpecName: "config") pod "04fea754-2f62-4c33-943e-e602f4875d21" (UID: "04fea754-2f62-4c33-943e-e602f4875d21"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.751975 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.752279 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04fea754-2f62-4c33-943e-e602f4875d21" (UID: "04fea754-2f62-4c33-943e-e602f4875d21"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.752460 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-config" (OuterVolumeSpecName: "config") pod "6446f1e4-f3de-4729-a5b9-f1e9974bfc67" (UID: "6446f1e4-f3de-4729-a5b9-f1e9974bfc67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.755990 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-kube-api-access-qn62j" (OuterVolumeSpecName: "kube-api-access-qn62j") pod "6446f1e4-f3de-4729-a5b9-f1e9974bfc67" (UID: "6446f1e4-f3de-4729-a5b9-f1e9974bfc67"). InnerVolumeSpecName "kube-api-access-qn62j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.756888 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04fea754-2f62-4c33-943e-e602f4875d21-kube-api-access-hhpjr" (OuterVolumeSpecName: "kube-api-access-hhpjr") pod "04fea754-2f62-4c33-943e-e602f4875d21" (UID: "04fea754-2f62-4c33-943e-e602f4875d21"). InnerVolumeSpecName "kube-api-access-hhpjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.852961 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhpjr\" (UniqueName: \"kubernetes.io/projected/04fea754-2f62-4c33-943e-e602f4875d21-kube-api-access-hhpjr\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.853354 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04fea754-2f62-4c33-943e-e602f4875d21-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.853909 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:02 crc kubenswrapper[4848]: I0217 09:20:02.854010 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn62j\" (UniqueName: \"kubernetes.io/projected/6446f1e4-f3de-4729-a5b9-f1e9974bfc67-kube-api-access-qn62j\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.179457 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-jbwkv"] Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.208642 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" event={"ID":"81f878e8-a78a-43d8-8f54-e10d5393335d","Type":"ContainerStarted","Data":"752733292aad205f95560236771e4bbe8e8603632364e2d745f4a158532f9ef4"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.208782 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.210937 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"9ccc9c6f-4e19-464f-9e06-7a3951c63c85","Type":"ContainerStarted","Data":"f066ce1a998220407967ce670499686cdcc62a08aff773d8f60d3ed0e50ecf00"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.213069 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"241bdede-0e36-4cfa-965b-89449d5f84f0","Type":"ContainerStarted","Data":"2ed395d9b43927ff4f52bb804883e0d3f7ffcc1aef19da472b141289cc08ddfb"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.215213 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c60672b9-d590-48a6-80c0-e3f74547b5c2","Type":"ContainerStarted","Data":"e3b0a179c55aef2c67cb85782529efa9c86d854c3089e79d629b2d4df417d3a3"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.216269 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db50eaa9-ca0a-4a83-98d8-fce82f849d91","Type":"ContainerStarted","Data":"3dfcbb0c9f17996275624b0f8bf1c1f2611bee778fca02953895dcb785168cee"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.218285 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm" Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.218356 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6fcf94d689-mx8rm" event={"ID":"04fea754-2f62-4c33-943e-e602f4875d21","Type":"ContainerDied","Data":"37cefdbf0333384eaa9582e2c12d42c9c4e30a21b5d00f852ae603490b8b25f5"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.219615 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7c3fada3-5297-4ddf-ada7-fe24f8a9b17f","Type":"ContainerStarted","Data":"94b1a33d43ba1ca534e518c6355ec088dc2b1e175067667569de4ce16f882fef"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.220686 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c695f" event={"ID":"43e80552-f64e-4257-a460-f108ee513c12","Type":"ContainerStarted","Data":"6c2890ae0556898c1574190897533badcf7480e3028dfe66ca288654722de5c3"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.221827 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ac51f8f5-cf36-44ef-b849-9bd6265e5156","Type":"ContainerStarted","Data":"f3cfbbaa8e004bcae3a66e808e45046c65707da8ac02f613202c4645f2dd9ae2"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.223386 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl" event={"ID":"6446f1e4-f3de-4729-a5b9-f1e9974bfc67","Type":"ContainerDied","Data":"79ff1d96e8fdcb4aaa7a8f65f2be6f3de4a9bda9acfb46fa2478a5e6f37241e9"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.223506 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-855cbc58c5-v8qhl" Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.226852 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0c03c6cc-b85f-465f-b692-8f50eaca7cd6","Type":"ContainerStarted","Data":"c2ed54833c151526abba4496fa0b72c632b4e86165f5546bbcdb77b67e642933"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.233850 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" event={"ID":"84f8e59b-6699-4b32-8772-d9347fd21259","Type":"ContainerStarted","Data":"5ef7f25940670c927395f21a3af49da52d6a2cd3cf0f6c7a25327fc3127e9ac1"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.234077 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.235427 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd7e9b9b-99f0-4720-b997-3f00996972e5","Type":"ContainerStarted","Data":"c5f0ab0653b4b772a4a2cbfa5453ba966069e6e9f42663253bb2f5af996474ac"} Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.236897 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" podStartSLOduration=2.985383317 podStartE2EDuration="16.236885565s" podCreationTimestamp="2026-02-17 09:19:47 +0000 UTC" firstStartedPulling="2026-02-17 09:19:48.552289732 +0000 UTC m=+866.095545378" lastFinishedPulling="2026-02-17 09:20:01.80379198 +0000 UTC m=+879.347047626" observedRunningTime="2026-02-17 09:20:03.229742715 +0000 UTC m=+880.772998351" watchObservedRunningTime="2026-02-17 09:20:03.236885565 +0000 UTC m=+880.780141211" Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.257098 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" 
podStartSLOduration=2.40798229 podStartE2EDuration="15.257080248s" podCreationTimestamp="2026-02-17 09:19:48 +0000 UTC" firstStartedPulling="2026-02-17 09:19:48.95633579 +0000 UTC m=+866.499591436" lastFinishedPulling="2026-02-17 09:20:01.805433718 +0000 UTC m=+879.348689394" observedRunningTime="2026-02-17 09:20:03.253239175 +0000 UTC m=+880.796494821" watchObservedRunningTime="2026-02-17 09:20:03.257080248 +0000 UTC m=+880.800335894" Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.319810 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-v8qhl"] Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.338209 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-855cbc58c5-v8qhl"] Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.370781 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-mx8rm"] Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.376031 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6fcf94d689-mx8rm"] Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.396717 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04fea754-2f62-4c33-943e-e602f4875d21" path="/var/lib/kubelet/pods/04fea754-2f62-4c33-943e-e602f4875d21/volumes" Feb 17 09:20:03 crc kubenswrapper[4848]: I0217 09:20:03.397092 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6446f1e4-f3de-4729-a5b9-f1e9974bfc67" path="/var/lib/kubelet/pods/6446f1e4-f3de-4729-a5b9-f1e9974bfc67/volumes" Feb 17 09:20:04 crc kubenswrapper[4848]: I0217 09:20:04.254160 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jbwkv" event={"ID":"e09d3b82-ad17-461a-89eb-b8ee45d4edff","Type":"ContainerStarted","Data":"2f7546ff5c0c3bd7b061a6e9202f01825f66ccea54863b5b86f6400a0fd65020"} Feb 17 09:20:08 crc kubenswrapper[4848]: I0217 09:20:08.080581 4848 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" Feb 17 09:20:08 crc kubenswrapper[4848]: I0217 09:20:08.399588 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:20:08 crc kubenswrapper[4848]: I0217 09:20:08.451952 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-d4ktk"] Feb 17 09:20:08 crc kubenswrapper[4848]: I0217 09:20:08.452161 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" podUID="81f878e8-a78a-43d8-8f54-e10d5393335d" containerName="dnsmasq-dns" containerID="cri-o://752733292aad205f95560236771e4bbe8e8603632364e2d745f4a158532f9ef4" gracePeriod=10 Feb 17 09:20:09 crc kubenswrapper[4848]: I0217 09:20:09.296613 4848 generic.go:334] "Generic (PLEG): container finished" podID="81f878e8-a78a-43d8-8f54-e10d5393335d" containerID="752733292aad205f95560236771e4bbe8e8603632364e2d745f4a158532f9ef4" exitCode=0 Feb 17 09:20:09 crc kubenswrapper[4848]: I0217 09:20:09.296978 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" event={"ID":"81f878e8-a78a-43d8-8f54-e10d5393335d","Type":"ContainerDied","Data":"752733292aad205f95560236771e4bbe8e8603632364e2d745f4a158532f9ef4"} Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.289979 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.318896 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" event={"ID":"81f878e8-a78a-43d8-8f54-e10d5393335d","Type":"ContainerDied","Data":"4f69e345f8e20d3a9b0e97714253418b57d4a006cac6a97b01f82b85f5c69ce9"} Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.318970 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f54874ffc-d4ktk" Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.318978 4848 scope.go:117] "RemoveContainer" containerID="752733292aad205f95560236771e4bbe8e8603632364e2d745f4a158532f9ef4" Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.374885 4848 scope.go:117] "RemoveContainer" containerID="7f5d4549a728921c6361d27e4f4eff5ad2b0b228823a2baf286d084654b790b9" Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.397911 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgx22\" (UniqueName: \"kubernetes.io/projected/81f878e8-a78a-43d8-8f54-e10d5393335d-kube-api-access-fgx22\") pod \"81f878e8-a78a-43d8-8f54-e10d5393335d\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.398012 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-config\") pod \"81f878e8-a78a-43d8-8f54-e10d5393335d\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.398119 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-dns-svc\") pod \"81f878e8-a78a-43d8-8f54-e10d5393335d\" (UID: \"81f878e8-a78a-43d8-8f54-e10d5393335d\") " Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.402689 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81f878e8-a78a-43d8-8f54-e10d5393335d-kube-api-access-fgx22" (OuterVolumeSpecName: "kube-api-access-fgx22") pod "81f878e8-a78a-43d8-8f54-e10d5393335d" (UID: "81f878e8-a78a-43d8-8f54-e10d5393335d"). InnerVolumeSpecName "kube-api-access-fgx22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.437939 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-config" (OuterVolumeSpecName: "config") pod "81f878e8-a78a-43d8-8f54-e10d5393335d" (UID: "81f878e8-a78a-43d8-8f54-e10d5393335d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.443749 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "81f878e8-a78a-43d8-8f54-e10d5393335d" (UID: "81f878e8-a78a-43d8-8f54-e10d5393335d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.499696 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.499737 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgx22\" (UniqueName: \"kubernetes.io/projected/81f878e8-a78a-43d8-8f54-e10d5393335d-kube-api-access-fgx22\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.499752 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81f878e8-a78a-43d8-8f54-e10d5393335d-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.652109 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f54874ffc-d4ktk"] Feb 17 09:20:10 crc kubenswrapper[4848]: I0217 09:20:10.657925 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-f54874ffc-d4ktk"] Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.043384 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.326360 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"0c03c6cc-b85f-465f-b692-8f50eaca7cd6","Type":"ContainerStarted","Data":"25f7631768da88c817553f90377173191c813c78e33636459d103df33e5df686"} Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.327530 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7c3fada3-5297-4ddf-ada7-fe24f8a9b17f","Type":"ContainerStarted","Data":"ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7"} Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.328451 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.330292 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9ccc9c6f-4e19-464f-9e06-7a3951c63c85","Type":"ContainerStarted","Data":"1cfb224eac622be75a8f8494bc55275ad666b1d798bd3725f7c7d59ac62eeb23"} Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.331933 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"c60672b9-d590-48a6-80c0-e3f74547b5c2","Type":"ContainerStarted","Data":"baa5e97d455f786772807a5edc06114d1cb2939048a85b6e7cc615a092a39765"} Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.332581 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.334174 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"ac51f8f5-cf36-44ef-b849-9bd6265e5156","Type":"ContainerStarted","Data":"0c492a1e86b27f36b464427560457cd1a38339d7c50f0734b01ba27cb955c771"} Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.335916 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c695f" event={"ID":"43e80552-f64e-4257-a460-f108ee513c12","Type":"ContainerStarted","Data":"2584f12832bd41b970f619f8f7be145bf4941a935e037ab259f8e8dc88eb3ea3"} Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.336291 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-c695f" Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.337332 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jbwkv" event={"ID":"e09d3b82-ad17-461a-89eb-b8ee45d4edff","Type":"ContainerStarted","Data":"f02ab18273a0b753ffc3d25f59d8c9732ec0920044f4c6d9f5fc681b175285ef"} Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.338894 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"241bdede-0e36-4cfa-965b-89449d5f84f0","Type":"ContainerStarted","Data":"cf5b1c94ddd7a1c33a66e2604ad401defc6095268c38fb9589c0ca4df74d9d5d"} Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.343997 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=8.95682276 podStartE2EDuration="17.343977202s" podCreationTimestamp="2026-02-17 09:19:54 +0000 UTC" firstStartedPulling="2026-02-17 09:20:02.431661493 +0000 UTC m=+879.974917139" lastFinishedPulling="2026-02-17 09:20:10.818815935 +0000 UTC m=+888.362071581" observedRunningTime="2026-02-17 09:20:11.34323478 +0000 UTC m=+888.886490426" watchObservedRunningTime="2026-02-17 09:20:11.343977202 +0000 UTC m=+888.887232858" Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.378712 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ovn-controller-c695f" podStartSLOduration=6.237970308 podStartE2EDuration="14.378696951s" podCreationTimestamp="2026-02-17 09:19:57 +0000 UTC" firstStartedPulling="2026-02-17 09:20:02.621910382 +0000 UTC m=+880.165166028" lastFinishedPulling="2026-02-17 09:20:10.762637025 +0000 UTC m=+888.305892671" observedRunningTime="2026-02-17 09:20:11.375453406 +0000 UTC m=+888.918709052" watchObservedRunningTime="2026-02-17 09:20:11.378696951 +0000 UTC m=+888.921952587" Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.390552 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=11.625868322 podStartE2EDuration="19.390532109s" podCreationTimestamp="2026-02-17 09:19:52 +0000 UTC" firstStartedPulling="2026-02-17 09:20:02.452583618 +0000 UTC m=+879.995839264" lastFinishedPulling="2026-02-17 09:20:10.217247405 +0000 UTC m=+887.760503051" observedRunningTime="2026-02-17 09:20:11.389102627 +0000 UTC m=+888.932358283" watchObservedRunningTime="2026-02-17 09:20:11.390532109 +0000 UTC m=+888.933787755" Feb 17 09:20:11 crc kubenswrapper[4848]: I0217 09:20:11.394875 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81f878e8-a78a-43d8-8f54-e10d5393335d" path="/var/lib/kubelet/pods/81f878e8-a78a-43d8-8f54-e10d5393335d/volumes" Feb 17 09:20:12 crc kubenswrapper[4848]: I0217 09:20:12.357229 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd7e9b9b-99f0-4720-b997-3f00996972e5","Type":"ContainerStarted","Data":"ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71"} Feb 17 09:20:12 crc kubenswrapper[4848]: I0217 09:20:12.361728 4848 generic.go:334] "Generic (PLEG): container finished" podID="e09d3b82-ad17-461a-89eb-b8ee45d4edff" containerID="f02ab18273a0b753ffc3d25f59d8c9732ec0920044f4c6d9f5fc681b175285ef" exitCode=0 Feb 17 09:20:12 crc kubenswrapper[4848]: I0217 09:20:12.361830 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovn-controller-ovs-jbwkv" event={"ID":"e09d3b82-ad17-461a-89eb-b8ee45d4edff","Type":"ContainerDied","Data":"f02ab18273a0b753ffc3d25f59d8c9732ec0920044f4c6d9f5fc681b175285ef"} Feb 17 09:20:12 crc kubenswrapper[4848]: I0217 09:20:12.364565 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db50eaa9-ca0a-4a83-98d8-fce82f849d91","Type":"ContainerStarted","Data":"812daf3e743d62ecf6c5cde4e775fa48bf1f6c9b01408f073084719ba69e530f"} Feb 17 09:20:13 crc kubenswrapper[4848]: I0217 09:20:13.372987 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jbwkv" event={"ID":"e09d3b82-ad17-461a-89eb-b8ee45d4edff","Type":"ContainerStarted","Data":"9c019efc5d06113b9b874e50a2558aecd99f3706d36eae08b5ea630c912d3193"} Feb 17 09:20:13 crc kubenswrapper[4848]: I0217 09:20:13.373441 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-jbwkv" event={"ID":"e09d3b82-ad17-461a-89eb-b8ee45d4edff","Type":"ContainerStarted","Data":"a47fce8edac8b70982ec0b24f0705b69b97a691723852b4ce0aa450b4188e6a5"} Feb 17 09:20:13 crc kubenswrapper[4848]: I0217 09:20:13.373458 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:20:13 crc kubenswrapper[4848]: I0217 09:20:13.373471 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:20:13 crc kubenswrapper[4848]: I0217 09:20:13.375401 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"9ccc9c6f-4e19-464f-9e06-7a3951c63c85","Type":"ContainerStarted","Data":"e645da416eb327bf6b57b7e239c923b83f306c61fe2556f5b12526b3e568f5de"} Feb 17 09:20:13 crc kubenswrapper[4848]: I0217 09:20:13.377488 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"0c03c6cc-b85f-465f-b692-8f50eaca7cd6","Type":"ContainerStarted","Data":"bd20ef15425479deacdbc536ce7a4069a45fa9cb24b98b2e4e45ed35e751dbf8"} Feb 17 09:20:13 crc kubenswrapper[4848]: I0217 09:20:13.396735 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-jbwkv" podStartSLOduration=9.176933526 podStartE2EDuration="16.396713318s" podCreationTimestamp="2026-02-17 09:19:57 +0000 UTC" firstStartedPulling="2026-02-17 09:20:03.582548929 +0000 UTC m=+881.125804575" lastFinishedPulling="2026-02-17 09:20:10.802328721 +0000 UTC m=+888.345584367" observedRunningTime="2026-02-17 09:20:13.393584286 +0000 UTC m=+890.936839942" watchObservedRunningTime="2026-02-17 09:20:13.396713318 +0000 UTC m=+890.939968974" Feb 17 09:20:13 crc kubenswrapper[4848]: I0217 09:20:13.416087 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=3.562284423 podStartE2EDuration="13.416062486s" podCreationTimestamp="2026-02-17 09:20:00 +0000 UTC" firstStartedPulling="2026-02-17 09:20:02.623592851 +0000 UTC m=+880.166848497" lastFinishedPulling="2026-02-17 09:20:12.477370904 +0000 UTC m=+890.020626560" observedRunningTime="2026-02-17 09:20:13.410274286 +0000 UTC m=+890.953529972" watchObservedRunningTime="2026-02-17 09:20:13.416062486 +0000 UTC m=+890.959318142" Feb 17 09:20:13 crc kubenswrapper[4848]: I0217 09:20:13.440979 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.524562535 podStartE2EDuration="17.440958528s" podCreationTimestamp="2026-02-17 09:19:56 +0000 UTC" firstStartedPulling="2026-02-17 09:20:02.566311048 +0000 UTC m=+880.109566694" lastFinishedPulling="2026-02-17 09:20:12.482707031 +0000 UTC m=+890.025962687" observedRunningTime="2026-02-17 09:20:13.436001452 +0000 UTC m=+890.979257148" watchObservedRunningTime="2026-02-17 09:20:13.440958528 +0000 UTC m=+890.984214184" Feb 17 
09:20:13 crc kubenswrapper[4848]: I0217 09:20:13.692223 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:15 crc kubenswrapper[4848]: I0217 09:20:15.395071 4848 generic.go:334] "Generic (PLEG): container finished" podID="241bdede-0e36-4cfa-965b-89449d5f84f0" containerID="cf5b1c94ddd7a1c33a66e2604ad401defc6095268c38fb9589c0ca4df74d9d5d" exitCode=0 Feb 17 09:20:15 crc kubenswrapper[4848]: I0217 09:20:15.397472 4848 generic.go:334] "Generic (PLEG): container finished" podID="ac51f8f5-cf36-44ef-b849-9bd6265e5156" containerID="0c492a1e86b27f36b464427560457cd1a38339d7c50f0734b01ba27cb955c771" exitCode=0 Feb 17 09:20:15 crc kubenswrapper[4848]: I0217 09:20:15.398951 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"241bdede-0e36-4cfa-965b-89449d5f84f0","Type":"ContainerDied","Data":"cf5b1c94ddd7a1c33a66e2604ad401defc6095268c38fb9589c0ca4df74d9d5d"} Feb 17 09:20:15 crc kubenswrapper[4848]: I0217 09:20:15.399022 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ac51f8f5-cf36-44ef-b849-9bd6265e5156","Type":"ContainerDied","Data":"0c492a1e86b27f36b464427560457cd1a38339d7c50f0734b01ba27cb955c771"} Feb 17 09:20:15 crc kubenswrapper[4848]: I0217 09:20:15.992578 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.037931 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.411819 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"241bdede-0e36-4cfa-965b-89449d5f84f0","Type":"ContainerStarted","Data":"c53442e3c5f9254760257c369d55eaa5bc7f29dfa284439cff586e6446153a46"} Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 
09:20:16.414734 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ac51f8f5-cf36-44ef-b849-9bd6265e5156","Type":"ContainerStarted","Data":"49ad791559d89623c58af824f48079af5deb3c235f85c951af63a2ec361d5c5f"} Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.415053 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.450894 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.569018742 podStartE2EDuration="26.450871742s" podCreationTimestamp="2026-02-17 09:19:50 +0000 UTC" firstStartedPulling="2026-02-17 09:20:02.347236613 +0000 UTC m=+879.890492259" lastFinishedPulling="2026-02-17 09:20:10.229089613 +0000 UTC m=+887.772345259" observedRunningTime="2026-02-17 09:20:16.441483896 +0000 UTC m=+893.984739642" watchObservedRunningTime="2026-02-17 09:20:16.450871742 +0000 UTC m=+893.994127388" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.485292 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.523402 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.130346417 podStartE2EDuration="27.523377652s" podCreationTimestamp="2026-02-17 09:19:49 +0000 UTC" firstStartedPulling="2026-02-17 09:20:02.414254222 +0000 UTC m=+879.957509868" lastFinishedPulling="2026-02-17 09:20:10.807285457 +0000 UTC m=+888.350541103" observedRunningTime="2026-02-17 09:20:16.478119053 +0000 UTC m=+894.021374769" watchObservedRunningTime="2026-02-17 09:20:16.523377652 +0000 UTC m=+894.066633298" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.717860 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Feb 17 
09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.767917 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.824083 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-z4bxb"] Feb 17 09:20:16 crc kubenswrapper[4848]: E0217 09:20:16.824443 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f878e8-a78a-43d8-8f54-e10d5393335d" containerName="dnsmasq-dns" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.824465 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f878e8-a78a-43d8-8f54-e10d5393335d" containerName="dnsmasq-dns" Feb 17 09:20:16 crc kubenswrapper[4848]: E0217 09:20:16.824486 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81f878e8-a78a-43d8-8f54-e10d5393335d" containerName="init" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.824494 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="81f878e8-a78a-43d8-8f54-e10d5393335d" containerName="init" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.824675 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="81f878e8-a78a-43d8-8f54-e10d5393335d" containerName="dnsmasq-dns" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.826913 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.829143 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-z4bxb"] Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.831871 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.890399 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-28cvn"] Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.893410 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.895588 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.899381 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-28cvn"] Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.924674 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-dns-svc\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.924797 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-ovsdbserver-nb\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.924843 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-config\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:16 crc kubenswrapper[4848]: I0217 09:20:16.924878 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w2x6\" (UniqueName: \"kubernetes.io/projected/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-kube-api-access-6w2x6\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.026372 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4ng2\" (UniqueName: \"kubernetes.io/projected/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-kube-api-access-s4ng2\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.026494 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-ovsdbserver-nb\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.026524 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-config\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.026552 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.026589 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-config\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.026610 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-ovn-rundir\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.026669 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w2x6\" (UniqueName: \"kubernetes.io/projected/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-kube-api-access-6w2x6\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.026737 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-combined-ca-bundle\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 
09:20:17.026794 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-ovs-rundir\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.026854 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-dns-svc\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.027555 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-dns-svc\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.027577 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-config\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.027586 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-ovsdbserver-nb\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.047547 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6w2x6\" (UniqueName: \"kubernetes.io/projected/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-kube-api-access-6w2x6\") pod \"dnsmasq-dns-64f7f48db9-z4bxb\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.130263 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-config\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.130340 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.130381 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-ovn-rundir\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.130432 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-combined-ca-bundle\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.130500 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: 
\"kubernetes.io/host-path/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-ovs-rundir\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.130645 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4ng2\" (UniqueName: \"kubernetes.io/projected/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-kube-api-access-s4ng2\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.130991 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-config\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.131569 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-ovn-rundir\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.131577 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-ovs-rundir\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.135022 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-metrics-certs-tls-certs\") pod 
\"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.135726 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-combined-ca-bundle\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.145494 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4ng2\" (UniqueName: \"kubernetes.io/projected/1f7ecdca-433f-4bcc-a3d5-e433a8db3bad-kube-api-access-s4ng2\") pod \"ovn-controller-metrics-28cvn\" (UID: \"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad\") " pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.160231 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.206418 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-z4bxb"] Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.209492 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-28cvn" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.232674 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-49grd"] Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.234012 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.237119 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.244343 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-49grd"] Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.334597 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-nb\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.334660 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-dns-svc\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.334693 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6kbh\" (UniqueName: \"kubernetes.io/projected/10cd4659-7654-4771-b759-7258b806c6c7-kube-api-access-s6kbh\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.334741 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-sb\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " 
pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.334785 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-config\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.358928 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.436890 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-nb\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.436978 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-dns-svc\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.437037 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6kbh\" (UniqueName: \"kubernetes.io/projected/10cd4659-7654-4771-b759-7258b806c6c7-kube-api-access-s6kbh\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.437113 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-sb\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.437167 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-config\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.438097 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-dns-svc\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.438139 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-config\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.438308 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-nb\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.438435 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-sb\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: 
\"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.455917 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6kbh\" (UniqueName: \"kubernetes.io/projected/10cd4659-7654-4771-b759-7258b806c6c7-kube-api-access-s6kbh\") pod \"dnsmasq-dns-56df986d9c-49grd\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.472019 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.564222 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.686491 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-z4bxb"] Feb 17 09:20:17 crc kubenswrapper[4848]: W0217 09:20:17.806901 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f7ecdca_433f_4bcc_a3d5_e433a8db3bad.slice/crio-36c52325052a5488b185d00a199cadb6f5b5216ccc4214a6c5236e5b11c643e8 WatchSource:0}: Error finding container 36c52325052a5488b185d00a199cadb6f5b5216ccc4214a6c5236e5b11c643e8: Status 404 returned error can't find the container with id 36c52325052a5488b185d00a199cadb6f5b5216ccc4214a6c5236e5b11c643e8 Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.817109 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-28cvn"] Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.830848 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-49grd"] Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.960484 4848 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/ovn-northd-0"] Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.961863 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.968212 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.968626 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-6xd4f" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.968998 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.969223 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 17 09:20:17 crc kubenswrapper[4848]: I0217 09:20:17.983536 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.050267 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5263f1c0-e02a-4383-ae9f-3b223486a59e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.050319 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5263f1c0-e02a-4383-ae9f-3b223486a59e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.050354 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5263f1c0-e02a-4383-ae9f-3b223486a59e-scripts\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.050387 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5263f1c0-e02a-4383-ae9f-3b223486a59e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.050418 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddj48\" (UniqueName: \"kubernetes.io/projected/5263f1c0-e02a-4383-ae9f-3b223486a59e-kube-api-access-ddj48\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.050452 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5263f1c0-e02a-4383-ae9f-3b223486a59e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.050513 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5263f1c0-e02a-4383-ae9f-3b223486a59e-config\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.151845 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5263f1c0-e02a-4383-ae9f-3b223486a59e-config\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " 
pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.152310 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5263f1c0-e02a-4383-ae9f-3b223486a59e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.152357 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5263f1c0-e02a-4383-ae9f-3b223486a59e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.152399 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5263f1c0-e02a-4383-ae9f-3b223486a59e-scripts\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.152439 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5263f1c0-e02a-4383-ae9f-3b223486a59e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.152481 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddj48\" (UniqueName: \"kubernetes.io/projected/5263f1c0-e02a-4383-ae9f-3b223486a59e-kube-api-access-ddj48\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.152522 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5263f1c0-e02a-4383-ae9f-3b223486a59e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.153379 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5263f1c0-e02a-4383-ae9f-3b223486a59e-config\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.154247 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5263f1c0-e02a-4383-ae9f-3b223486a59e-scripts\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.156017 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5263f1c0-e02a-4383-ae9f-3b223486a59e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.158161 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5263f1c0-e02a-4383-ae9f-3b223486a59e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.158359 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5263f1c0-e02a-4383-ae9f-3b223486a59e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.158951 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5263f1c0-e02a-4383-ae9f-3b223486a59e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.174415 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddj48\" (UniqueName: \"kubernetes.io/projected/5263f1c0-e02a-4383-ae9f-3b223486a59e-kube-api-access-ddj48\") pod \"ovn-northd-0\" (UID: \"5263f1c0-e02a-4383-ae9f-3b223486a59e\") " pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.335059 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.435898 4848 generic.go:334] "Generic (PLEG): container finished" podID="10cd4659-7654-4771-b759-7258b806c6c7" containerID="512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d" exitCode=0 Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.435995 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df986d9c-49grd" event={"ID":"10cd4659-7654-4771-b759-7258b806c6c7","Type":"ContainerDied","Data":"512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d"} Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.436033 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df986d9c-49grd" event={"ID":"10cd4659-7654-4771-b759-7258b806c6c7","Type":"ContainerStarted","Data":"eb0ef4bce6bfac126c8d67debdde8c63f08f4be92c3271f9b181e11bdee911fb"} Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.438403 4848 generic.go:334] "Generic (PLEG): container finished" podID="5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5" containerID="dec0f238f1022a9361e8e8e292ebc4f1da3fcf36d38e0a091d1af99417f7b9e9" exitCode=0 Feb 17 09:20:18 crc 
kubenswrapper[4848]: I0217 09:20:18.438550 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" event={"ID":"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5","Type":"ContainerDied","Data":"dec0f238f1022a9361e8e8e292ebc4f1da3fcf36d38e0a091d1af99417f7b9e9"} Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.438600 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" event={"ID":"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5","Type":"ContainerStarted","Data":"e3a47258f848a72eb70c963f70df03b04b837785b57613bd94a3c0921e8a5fdd"} Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.445720 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-28cvn" event={"ID":"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad","Type":"ContainerStarted","Data":"c5a904f9f038e1fc2d73718dc61d884a4ef86cf9012aead989fe32e3f22bf215"} Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.445831 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-28cvn" event={"ID":"1f7ecdca-433f-4bcc-a3d5-e433a8db3bad","Type":"ContainerStarted","Data":"36c52325052a5488b185d00a199cadb6f5b5216ccc4214a6c5236e5b11c643e8"} Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.554942 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-28cvn" podStartSLOduration=2.5549191860000002 podStartE2EDuration="2.554919186s" podCreationTimestamp="2026-02-17 09:20:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:20:18.541892423 +0000 UTC m=+896.085148069" watchObservedRunningTime="2026-02-17 09:20:18.554919186 +0000 UTC m=+896.098174832" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.849466 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 17 09:20:18 crc 
kubenswrapper[4848]: W0217 09:20:18.852672 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5263f1c0_e02a_4383_ae9f_3b223486a59e.slice/crio-69e69b14e9c067e10bfcdbc4f82e2ebe477ff8e0f69137c99ddc7fd6f3fc412f WatchSource:0}: Error finding container 69e69b14e9c067e10bfcdbc4f82e2ebe477ff8e0f69137c99ddc7fd6f3fc412f: Status 404 returned error can't find the container with id 69e69b14e9c067e10bfcdbc4f82e2ebe477ff8e0f69137c99ddc7fd6f3fc412f Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.854104 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.967846 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-dns-svc\") pod \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.968161 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-config\") pod \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.968203 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w2x6\" (UniqueName: \"kubernetes.io/projected/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-kube-api-access-6w2x6\") pod \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.968266 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-ovsdbserver-nb\") pod \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\" (UID: \"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5\") " Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.972484 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-kube-api-access-6w2x6" (OuterVolumeSpecName: "kube-api-access-6w2x6") pod "5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5" (UID: "5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5"). InnerVolumeSpecName "kube-api-access-6w2x6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.985397 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5" (UID: "5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.986334 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-config" (OuterVolumeSpecName: "config") pod "5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5" (UID: "5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:18 crc kubenswrapper[4848]: I0217 09:20:18.994382 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5" (UID: "5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:19 crc kubenswrapper[4848]: I0217 09:20:19.072009 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:19 crc kubenswrapper[4848]: I0217 09:20:19.072044 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:19 crc kubenswrapper[4848]: I0217 09:20:19.072064 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w2x6\" (UniqueName: \"kubernetes.io/projected/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-kube-api-access-6w2x6\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:19 crc kubenswrapper[4848]: I0217 09:20:19.072079 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:19 crc kubenswrapper[4848]: I0217 09:20:19.451731 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" event={"ID":"5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5","Type":"ContainerDied","Data":"e3a47258f848a72eb70c963f70df03b04b837785b57613bd94a3c0921e8a5fdd"} Feb 17 09:20:19 crc kubenswrapper[4848]: I0217 09:20:19.451756 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64f7f48db9-z4bxb" Feb 17 09:20:19 crc kubenswrapper[4848]: I0217 09:20:19.451799 4848 scope.go:117] "RemoveContainer" containerID="dec0f238f1022a9361e8e8e292ebc4f1da3fcf36d38e0a091d1af99417f7b9e9" Feb 17 09:20:19 crc kubenswrapper[4848]: I0217 09:20:19.454683 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5263f1c0-e02a-4383-ae9f-3b223486a59e","Type":"ContainerStarted","Data":"69e69b14e9c067e10bfcdbc4f82e2ebe477ff8e0f69137c99ddc7fd6f3fc412f"} Feb 17 09:20:19 crc kubenswrapper[4848]: I0217 09:20:19.525230 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-z4bxb"] Feb 17 09:20:19 crc kubenswrapper[4848]: I0217 09:20:19.531327 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64f7f48db9-z4bxb"] Feb 17 09:20:20 crc kubenswrapper[4848]: I0217 09:20:20.753439 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 17 09:20:20 crc kubenswrapper[4848]: I0217 09:20:20.753507 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 17 09:20:21 crc kubenswrapper[4848]: I0217 09:20:21.399008 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5" path="/var/lib/kubelet/pods/5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5/volumes" Feb 17 09:20:22 crc kubenswrapper[4848]: I0217 09:20:22.044645 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 17 09:20:22 crc kubenswrapper[4848]: I0217 09:20:22.045022 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 17 09:20:24 crc kubenswrapper[4848]: I0217 09:20:24.500399 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df986d9c-49grd" 
event={"ID":"10cd4659-7654-4771-b759-7258b806c6c7","Type":"ContainerStarted","Data":"c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83"} Feb 17 09:20:24 crc kubenswrapper[4848]: I0217 09:20:24.500784 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:24 crc kubenswrapper[4848]: I0217 09:20:24.530867 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df986d9c-49grd" podStartSLOduration=7.530846081 podStartE2EDuration="7.530846081s" podCreationTimestamp="2026-02-17 09:20:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:20:24.528820452 +0000 UTC m=+902.072076108" watchObservedRunningTime="2026-02-17 09:20:24.530846081 +0000 UTC m=+902.074101737" Feb 17 09:20:24 crc kubenswrapper[4848]: I0217 09:20:24.570343 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 17 09:20:24 crc kubenswrapper[4848]: I0217 09:20:24.644523 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 17 09:20:24 crc kubenswrapper[4848]: I0217 09:20:24.960948 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.021649 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-49grd"] Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.063340 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-nxk7q"] Feb 17 09:20:25 crc kubenswrapper[4848]: E0217 09:20:25.064108 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5" containerName="init" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.064128 
4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5" containerName="init" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.064332 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d053a3b-c66a-49fd-8fb5-0e5aa1e458d5" containerName="init" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.065423 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.101573 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-nxk7q"] Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.183559 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-dns-svc\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.183597 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-sb\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.183697 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-nb\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.183801 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-config\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.183859 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98zkj\" (UniqueName: \"kubernetes.io/projected/cfbaba75-c3cd-4281-903a-1e77c7409afc-kube-api-access-98zkj\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.285580 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-config\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.285637 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98zkj\" (UniqueName: \"kubernetes.io/projected/cfbaba75-c3cd-4281-903a-1e77c7409afc-kube-api-access-98zkj\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.285680 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-dns-svc\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.285699 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-sb\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.285754 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-nb\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.286509 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-nb\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.287017 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-config\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.287668 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-sb\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.287668 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-dns-svc\") pod 
\"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.305748 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98zkj\" (UniqueName: \"kubernetes.io/projected/cfbaba75-c3cd-4281-903a-1e77c7409afc-kube-api-access-98zkj\") pod \"dnsmasq-dns-66b577f8c-nxk7q\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.404050 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:25 crc kubenswrapper[4848]: I0217 09:20:25.880984 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-nxk7q"] Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.165218 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.171130 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.172929 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.173206 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.173691 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.173753 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-mktkv" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.195403 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.314016 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5bc15802-6def-48fe-8fd5-e6d85d068827-lock\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.314069 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5bc15802-6def-48fe-8fd5-e6d85d068827-cache\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.314267 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc15802-6def-48fe-8fd5-e6d85d068827-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 
09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.314421 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fzmc\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-kube-api-access-4fzmc\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.314462 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.314485 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.416610 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5bc15802-6def-48fe-8fd5-e6d85d068827-lock\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.416918 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5bc15802-6def-48fe-8fd5-e6d85d068827-cache\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.416988 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5bc15802-6def-48fe-8fd5-e6d85d068827-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.417047 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fzmc\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-kube-api-access-4fzmc\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.417081 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.417112 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.417177 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/5bc15802-6def-48fe-8fd5-e6d85d068827-lock\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: E0217 09:20:26.417280 4848 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 09:20:26 crc kubenswrapper[4848]: E0217 09:20:26.417310 4848 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 09:20:26 crc 
kubenswrapper[4848]: I0217 09:20:26.417353 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/5bc15802-6def-48fe-8fd5-e6d85d068827-cache\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: E0217 09:20:26.417384 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift podName:5bc15802-6def-48fe-8fd5-e6d85d068827 nodeName:}" failed. No retries permitted until 2026-02-17 09:20:26.917358776 +0000 UTC m=+904.460614432 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift") pod "swift-storage-0" (UID: "5bc15802-6def-48fe-8fd5-e6d85d068827") : configmap "swift-ring-files" not found Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.417389 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.422829 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bc15802-6def-48fe-8fd5-e6d85d068827-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.450276 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fzmc\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-kube-api-access-4fzmc\") pod \"swift-storage-0\" (UID: 
\"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.451783 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.501240 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.519199 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5263f1c0-e02a-4383-ae9f-3b223486a59e","Type":"ContainerStarted","Data":"cc3be432c1e53029c7d5f71e7484229cc7a7f4ab180025277fab7470854f3f48"} Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.519245 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5263f1c0-e02a-4383-ae9f-3b223486a59e","Type":"ContainerStarted","Data":"2fe0d9fdac49073f3be047418f1b103b0d8e43a320742724f337b8ed4b755d24"} Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.519402 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.522823 4848 generic.go:334] "Generic (PLEG): container finished" podID="cfbaba75-c3cd-4281-903a-1e77c7409afc" containerID="0e6f8ff0d254a1c12057a1fc0d095620b587f7452d72c795827ad107faabf2a5" exitCode=0 Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.523091 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df986d9c-49grd" podUID="10cd4659-7654-4771-b759-7258b806c6c7" containerName="dnsmasq-dns" containerID="cri-o://c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83" gracePeriod=10 Feb 17 09:20:26 crc kubenswrapper[4848]: 
I0217 09:20:26.523126 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" event={"ID":"cfbaba75-c3cd-4281-903a-1e77c7409afc","Type":"ContainerDied","Data":"0e6f8ff0d254a1c12057a1fc0d095620b587f7452d72c795827ad107faabf2a5"} Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.523167 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" event={"ID":"cfbaba75-c3cd-4281-903a-1e77c7409afc","Type":"ContainerStarted","Data":"bce38fa0d9306a714c4cadd92b11b9c9f14ff2dde397ee386e66a22ea610033d"} Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.553826 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.9714706360000003 podStartE2EDuration="9.553802624s" podCreationTimestamp="2026-02-17 09:20:17 +0000 UTC" firstStartedPulling="2026-02-17 09:20:18.855873586 +0000 UTC m=+896.399129222" lastFinishedPulling="2026-02-17 09:20:25.438205564 +0000 UTC m=+902.981461210" observedRunningTime="2026-02-17 09:20:26.55230563 +0000 UTC m=+904.095561286" watchObservedRunningTime="2026-02-17 09:20:26.553802624 +0000 UTC m=+904.097058290" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.622044 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.923599 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:26 crc kubenswrapper[4848]: E0217 09:20:26.923844 4848 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 09:20:26 crc kubenswrapper[4848]: E0217 09:20:26.923879 4848 projected.go:194] Error 
preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 09:20:26 crc kubenswrapper[4848]: E0217 09:20:26.923974 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift podName:5bc15802-6def-48fe-8fd5-e6d85d068827 nodeName:}" failed. No retries permitted until 2026-02-17 09:20:27.923941756 +0000 UTC m=+905.467197452 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift") pod "swift-storage-0" (UID: "5bc15802-6def-48fe-8fd5-e6d85d068827") : configmap "swift-ring-files" not found Feb 17 09:20:26 crc kubenswrapper[4848]: I0217 09:20:26.952463 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.025045 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-nb\") pod \"10cd4659-7654-4771-b759-7258b806c6c7\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.025145 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-config\") pod \"10cd4659-7654-4771-b759-7258b806c6c7\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.025200 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6kbh\" (UniqueName: \"kubernetes.io/projected/10cd4659-7654-4771-b759-7258b806c6c7-kube-api-access-s6kbh\") pod \"10cd4659-7654-4771-b759-7258b806c6c7\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " Feb 
17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.025245 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-sb\") pod \"10cd4659-7654-4771-b759-7258b806c6c7\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.025279 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-dns-svc\") pod \"10cd4659-7654-4771-b759-7258b806c6c7\" (UID: \"10cd4659-7654-4771-b759-7258b806c6c7\") " Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.040959 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10cd4659-7654-4771-b759-7258b806c6c7-kube-api-access-s6kbh" (OuterVolumeSpecName: "kube-api-access-s6kbh") pod "10cd4659-7654-4771-b759-7258b806c6c7" (UID: "10cd4659-7654-4771-b759-7258b806c6c7"). InnerVolumeSpecName "kube-api-access-s6kbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.079150 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-config" (OuterVolumeSpecName: "config") pod "10cd4659-7654-4771-b759-7258b806c6c7" (UID: "10cd4659-7654-4771-b759-7258b806c6c7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.083147 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "10cd4659-7654-4771-b759-7258b806c6c7" (UID: "10cd4659-7654-4771-b759-7258b806c6c7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.088474 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "10cd4659-7654-4771-b759-7258b806c6c7" (UID: "10cd4659-7654-4771-b759-7258b806c6c7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.100424 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "10cd4659-7654-4771-b759-7258b806c6c7" (UID: "10cd4659-7654-4771-b759-7258b806c6c7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.127627 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6kbh\" (UniqueName: \"kubernetes.io/projected/10cd4659-7654-4771-b759-7258b806c6c7-kube-api-access-s6kbh\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.127658 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.127667 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.127675 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 
09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.127684 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/10cd4659-7654-4771-b759-7258b806c6c7-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.533907 4848 generic.go:334] "Generic (PLEG): container finished" podID="10cd4659-7654-4771-b759-7258b806c6c7" containerID="c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83" exitCode=0 Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.534004 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df986d9c-49grd" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.533976 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df986d9c-49grd" event={"ID":"10cd4659-7654-4771-b759-7258b806c6c7","Type":"ContainerDied","Data":"c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83"} Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.534142 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df986d9c-49grd" event={"ID":"10cd4659-7654-4771-b759-7258b806c6c7","Type":"ContainerDied","Data":"eb0ef4bce6bfac126c8d67debdde8c63f08f4be92c3271f9b181e11bdee911fb"} Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.534188 4848 scope.go:117] "RemoveContainer" containerID="c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.537844 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" event={"ID":"cfbaba75-c3cd-4281-903a-1e77c7409afc","Type":"ContainerStarted","Data":"13e36b3a73c7897c0c9292939e1849490e5d428526341cb6d5d4106ef651477b"} Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.538688 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:27 
crc kubenswrapper[4848]: I0217 09:20:27.559006 4848 scope.go:117] "RemoveContainer" containerID="512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.587100 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6ee8-account-create-update-w4zgx"] Feb 17 09:20:27 crc kubenswrapper[4848]: E0217 09:20:27.587598 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10cd4659-7654-4771-b759-7258b806c6c7" containerName="dnsmasq-dns" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.587612 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cd4659-7654-4771-b759-7258b806c6c7" containerName="dnsmasq-dns" Feb 17 09:20:27 crc kubenswrapper[4848]: E0217 09:20:27.587654 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10cd4659-7654-4771-b759-7258b806c6c7" containerName="init" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.587660 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="10cd4659-7654-4771-b759-7258b806c6c7" containerName="init" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.587835 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="10cd4659-7654-4771-b759-7258b806c6c7" containerName="dnsmasq-dns" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.588351 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6ee8-account-create-update-w4zgx" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.589960 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.597216 4848 scope.go:117] "RemoveContainer" containerID="c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83" Feb 17 09:20:27 crc kubenswrapper[4848]: E0217 09:20:27.597684 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83\": container with ID starting with c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83 not found: ID does not exist" containerID="c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.597715 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83"} err="failed to get container status \"c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83\": rpc error: code = NotFound desc = could not find container \"c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83\": container with ID starting with c3aaa347f699be9b07507a783e915cde9e8981c54279f435d1e1575c66a2ee83 not found: ID does not exist" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.597735 4848 scope.go:117] "RemoveContainer" containerID="512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d" Feb 17 09:20:27 crc kubenswrapper[4848]: E0217 09:20:27.598319 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d\": container with ID starting with 
512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d not found: ID does not exist" containerID="512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.598346 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d"} err="failed to get container status \"512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d\": rpc error: code = NotFound desc = could not find container \"512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d\": container with ID starting with 512569e029a50acbab7f17abddb5f9e9f456a217acbd13697778de61e94e528d not found: ID does not exist" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.600682 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" podStartSLOduration=2.600659634 podStartE2EDuration="2.600659634s" podCreationTimestamp="2026-02-17 09:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:20:27.572905889 +0000 UTC m=+905.116161535" watchObservedRunningTime="2026-02-17 09:20:27.600659634 +0000 UTC m=+905.143915290" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.614458 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6ee8-account-create-update-w4zgx"] Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.625199 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-p7czj"] Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.626440 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-49grd"] Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.626539 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-p7czj" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.631695 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df986d9c-49grd"] Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.641305 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-p7czj"] Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.740546 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbe13273-ef85-4246-99bf-cb85a278c25d-operator-scripts\") pod \"glance-db-create-p7czj\" (UID: \"bbe13273-ef85-4246-99bf-cb85a278c25d\") " pod="openstack/glance-db-create-p7czj" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.740591 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a0b47d8-84ea-4b51-a171-cb713122e873-operator-scripts\") pod \"glance-6ee8-account-create-update-w4zgx\" (UID: \"2a0b47d8-84ea-4b51-a171-cb713122e873\") " pod="openstack/glance-6ee8-account-create-update-w4zgx" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.740693 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s79v\" (UniqueName: \"kubernetes.io/projected/2a0b47d8-84ea-4b51-a171-cb713122e873-kube-api-access-9s79v\") pod \"glance-6ee8-account-create-update-w4zgx\" (UID: \"2a0b47d8-84ea-4b51-a171-cb713122e873\") " pod="openstack/glance-6ee8-account-create-update-w4zgx" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.740830 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhvg8\" (UniqueName: \"kubernetes.io/projected/bbe13273-ef85-4246-99bf-cb85a278c25d-kube-api-access-xhvg8\") pod \"glance-db-create-p7czj\" (UID: 
\"bbe13273-ef85-4246-99bf-cb85a278c25d\") " pod="openstack/glance-db-create-p7czj" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.843040 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbe13273-ef85-4246-99bf-cb85a278c25d-operator-scripts\") pod \"glance-db-create-p7czj\" (UID: \"bbe13273-ef85-4246-99bf-cb85a278c25d\") " pod="openstack/glance-db-create-p7czj" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.843134 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a0b47d8-84ea-4b51-a171-cb713122e873-operator-scripts\") pod \"glance-6ee8-account-create-update-w4zgx\" (UID: \"2a0b47d8-84ea-4b51-a171-cb713122e873\") " pod="openstack/glance-6ee8-account-create-update-w4zgx" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.843174 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s79v\" (UniqueName: \"kubernetes.io/projected/2a0b47d8-84ea-4b51-a171-cb713122e873-kube-api-access-9s79v\") pod \"glance-6ee8-account-create-update-w4zgx\" (UID: \"2a0b47d8-84ea-4b51-a171-cb713122e873\") " pod="openstack/glance-6ee8-account-create-update-w4zgx" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.843251 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhvg8\" (UniqueName: \"kubernetes.io/projected/bbe13273-ef85-4246-99bf-cb85a278c25d-kube-api-access-xhvg8\") pod \"glance-db-create-p7czj\" (UID: \"bbe13273-ef85-4246-99bf-cb85a278c25d\") " pod="openstack/glance-db-create-p7czj" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.843851 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbe13273-ef85-4246-99bf-cb85a278c25d-operator-scripts\") pod \"glance-db-create-p7czj\" (UID: 
\"bbe13273-ef85-4246-99bf-cb85a278c25d\") " pod="openstack/glance-db-create-p7czj" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.843851 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a0b47d8-84ea-4b51-a171-cb713122e873-operator-scripts\") pod \"glance-6ee8-account-create-update-w4zgx\" (UID: \"2a0b47d8-84ea-4b51-a171-cb713122e873\") " pod="openstack/glance-6ee8-account-create-update-w4zgx" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.869359 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhvg8\" (UniqueName: \"kubernetes.io/projected/bbe13273-ef85-4246-99bf-cb85a278c25d-kube-api-access-xhvg8\") pod \"glance-db-create-p7czj\" (UID: \"bbe13273-ef85-4246-99bf-cb85a278c25d\") " pod="openstack/glance-db-create-p7czj" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.869563 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s79v\" (UniqueName: \"kubernetes.io/projected/2a0b47d8-84ea-4b51-a171-cb713122e873-kube-api-access-9s79v\") pod \"glance-6ee8-account-create-update-w4zgx\" (UID: \"2a0b47d8-84ea-4b51-a171-cb713122e873\") " pod="openstack/glance-6ee8-account-create-update-w4zgx" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.944665 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:27 crc kubenswrapper[4848]: E0217 09:20:27.944858 4848 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 09:20:27 crc kubenswrapper[4848]: E0217 09:20:27.944878 4848 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: 
configmap "swift-ring-files" not found Feb 17 09:20:27 crc kubenswrapper[4848]: E0217 09:20:27.944929 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift podName:5bc15802-6def-48fe-8fd5-e6d85d068827 nodeName:}" failed. No retries permitted until 2026-02-17 09:20:29.944910185 +0000 UTC m=+907.488165831 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift") pod "swift-storage-0" (UID: "5bc15802-6def-48fe-8fd5-e6d85d068827") : configmap "swift-ring-files" not found Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.949135 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6ee8-account-create-update-w4zgx" Feb 17 09:20:27 crc kubenswrapper[4848]: I0217 09:20:27.959925 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-p7czj" Feb 17 09:20:28 crc kubenswrapper[4848]: I0217 09:20:28.409975 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6ee8-account-create-update-w4zgx"] Feb 17 09:20:28 crc kubenswrapper[4848]: W0217 09:20:28.418043 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a0b47d8_84ea_4b51_a171_cb713122e873.slice/crio-a686adee2cf1459d8d68b0ce6c08db14eb95ccfcacedfb74766e16da935b49d8 WatchSource:0}: Error finding container a686adee2cf1459d8d68b0ce6c08db14eb95ccfcacedfb74766e16da935b49d8: Status 404 returned error can't find the container with id a686adee2cf1459d8d68b0ce6c08db14eb95ccfcacedfb74766e16da935b49d8 Feb 17 09:20:28 crc kubenswrapper[4848]: I0217 09:20:28.501773 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-p7czj"] Feb 17 09:20:28 crc kubenswrapper[4848]: W0217 09:20:28.515620 4848 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbbe13273_ef85_4246_99bf_cb85a278c25d.slice/crio-d1f16c04917b455cb00adebac75486d708c6ce28a21a62de6f12555d7fbe87f8 WatchSource:0}: Error finding container d1f16c04917b455cb00adebac75486d708c6ce28a21a62de6f12555d7fbe87f8: Status 404 returned error can't find the container with id d1f16c04917b455cb00adebac75486d708c6ce28a21a62de6f12555d7fbe87f8 Feb 17 09:20:28 crc kubenswrapper[4848]: I0217 09:20:28.544891 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p7czj" event={"ID":"bbe13273-ef85-4246-99bf-cb85a278c25d","Type":"ContainerStarted","Data":"d1f16c04917b455cb00adebac75486d708c6ce28a21a62de6f12555d7fbe87f8"} Feb 17 09:20:28 crc kubenswrapper[4848]: I0217 09:20:28.546060 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6ee8-account-create-update-w4zgx" event={"ID":"2a0b47d8-84ea-4b51-a171-cb713122e873","Type":"ContainerStarted","Data":"a686adee2cf1459d8d68b0ce6c08db14eb95ccfcacedfb74766e16da935b49d8"} Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.359733 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-96bhn"] Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.360722 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-96bhn" Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.363597 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.376901 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-96bhn"] Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.400994 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10cd4659-7654-4771-b759-7258b806c6c7" path="/var/lib/kubelet/pods/10cd4659-7654-4771-b759-7258b806c6c7/volumes" Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.468981 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-operator-scripts\") pod \"root-account-create-update-96bhn\" (UID: \"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd\") " pod="openstack/root-account-create-update-96bhn" Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.469048 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hssjk\" (UniqueName: \"kubernetes.io/projected/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-kube-api-access-hssjk\") pod \"root-account-create-update-96bhn\" (UID: \"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd\") " pod="openstack/root-account-create-update-96bhn" Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.555745 4848 generic.go:334] "Generic (PLEG): container finished" podID="bbe13273-ef85-4246-99bf-cb85a278c25d" containerID="ce96dc17b10dd0adbf3d1534437111e4b5801c23ffe9f7d1bf9a7d9b75233630" exitCode=0 Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.555818 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p7czj" 
event={"ID":"bbe13273-ef85-4246-99bf-cb85a278c25d","Type":"ContainerDied","Data":"ce96dc17b10dd0adbf3d1534437111e4b5801c23ffe9f7d1bf9a7d9b75233630"} Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.557579 4848 generic.go:334] "Generic (PLEG): container finished" podID="2a0b47d8-84ea-4b51-a171-cb713122e873" containerID="a692e8999ccbddeb59800c874722b8e348dcb571dd3d19c772f852c0ea2c8425" exitCode=0 Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.557618 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6ee8-account-create-update-w4zgx" event={"ID":"2a0b47d8-84ea-4b51-a171-cb713122e873","Type":"ContainerDied","Data":"a692e8999ccbddeb59800c874722b8e348dcb571dd3d19c772f852c0ea2c8425"} Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.570185 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-operator-scripts\") pod \"root-account-create-update-96bhn\" (UID: \"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd\") " pod="openstack/root-account-create-update-96bhn" Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.570262 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hssjk\" (UniqueName: \"kubernetes.io/projected/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-kube-api-access-hssjk\") pod \"root-account-create-update-96bhn\" (UID: \"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd\") " pod="openstack/root-account-create-update-96bhn" Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.570984 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-operator-scripts\") pod \"root-account-create-update-96bhn\" (UID: \"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd\") " pod="openstack/root-account-create-update-96bhn" Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.616631 
4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hssjk\" (UniqueName: \"kubernetes.io/projected/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-kube-api-access-hssjk\") pod \"root-account-create-update-96bhn\" (UID: \"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd\") " pod="openstack/root-account-create-update-96bhn" Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.677683 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-96bhn" Feb 17 09:20:29 crc kubenswrapper[4848]: I0217 09:20:29.977142 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:29 crc kubenswrapper[4848]: E0217 09:20:29.977617 4848 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 09:20:29 crc kubenswrapper[4848]: E0217 09:20:29.977633 4848 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 09:20:29 crc kubenswrapper[4848]: E0217 09:20:29.977682 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift podName:5bc15802-6def-48fe-8fd5-e6d85d068827 nodeName:}" failed. No retries permitted until 2026-02-17 09:20:33.977667346 +0000 UTC m=+911.520922992 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift") pod "swift-storage-0" (UID: "5bc15802-6def-48fe-8fd5-e6d85d068827") : configmap "swift-ring-files" not found Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.086156 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-c6lrl"] Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.087155 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.089261 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.089592 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.090178 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.102115 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-c6lrl"] Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.171980 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-96bhn"] Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.180125 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-scripts\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.180180 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-combined-ca-bundle\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.180220 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntvrw\" (UniqueName: \"kubernetes.io/projected/6a049c1c-b425-44cc-bde0-2e83be29d1a1-kube-api-access-ntvrw\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.180271 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-dispersionconf\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.180295 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6a049c1c-b425-44cc-bde0-2e83be29d1a1-etc-swift\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.180362 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-ring-data-devices\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.180398 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-swiftconf\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: W0217 09:20:30.180731 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ffa0ee3_7860_4df2_9a5a_dd960a4851cd.slice/crio-9783d205251679c949fec7f04bc881b0e5813f131b932a406fef0d7e593eccc0 WatchSource:0}: Error finding container 9783d205251679c949fec7f04bc881b0e5813f131b932a406fef0d7e593eccc0: Status 404 returned error can't find the container with id 9783d205251679c949fec7f04bc881b0e5813f131b932a406fef0d7e593eccc0 Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.251610 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h5g8g"] Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.253939 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.275834 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5g8g"] Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.281638 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntvrw\" (UniqueName: \"kubernetes.io/projected/6a049c1c-b425-44cc-bde0-2e83be29d1a1-kube-api-access-ntvrw\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.281716 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-dispersionconf\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.281743 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6a049c1c-b425-44cc-bde0-2e83be29d1a1-etc-swift\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.281800 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-catalog-content\") pod \"community-operators-h5g8g\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.281868 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-ring-data-devices\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.281900 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-swiftconf\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.281957 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-utilities\") pod \"community-operators-h5g8g\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.282003 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9knnc\" (UniqueName: \"kubernetes.io/projected/40074d0d-335c-4da0-900b-43a3cd3ea091-kube-api-access-9knnc\") pod \"community-operators-h5g8g\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.282029 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-scripts\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.282052 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-combined-ca-bundle\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.283302 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-ring-data-devices\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.283629 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6a049c1c-b425-44cc-bde0-2e83be29d1a1-etc-swift\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.285006 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-scripts\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.289781 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-swiftconf\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.297504 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-combined-ca-bundle\") pod \"swift-ring-rebalance-c6lrl\" (UID: 
\"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.298554 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-dispersionconf\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.312578 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntvrw\" (UniqueName: \"kubernetes.io/projected/6a049c1c-b425-44cc-bde0-2e83be29d1a1-kube-api-access-ntvrw\") pod \"swift-ring-rebalance-c6lrl\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.383571 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-catalog-content\") pod \"community-operators-h5g8g\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.383687 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-utilities\") pod \"community-operators-h5g8g\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.383736 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9knnc\" (UniqueName: \"kubernetes.io/projected/40074d0d-335c-4da0-900b-43a3cd3ea091-kube-api-access-9knnc\") pod \"community-operators-h5g8g\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " 
pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.384269 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-catalog-content\") pod \"community-operators-h5g8g\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.384465 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-utilities\") pod \"community-operators-h5g8g\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.401296 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9knnc\" (UniqueName: \"kubernetes.io/projected/40074d0d-335c-4da0-900b-43a3cd3ea091-kube-api-access-9knnc\") pod \"community-operators-h5g8g\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.406876 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.568527 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-96bhn" event={"ID":"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd","Type":"ContainerStarted","Data":"626b49679a5d5e4c9c615170a4f40a8f6453615e3c804816282db06d7d975ac2"} Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.569232 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-96bhn" event={"ID":"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd","Type":"ContainerStarted","Data":"9783d205251679c949fec7f04bc881b0e5813f131b932a406fef0d7e593eccc0"} Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.582520 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-96bhn" podStartSLOduration=1.5824997490000001 podStartE2EDuration="1.582499749s" podCreationTimestamp="2026-02-17 09:20:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:20:30.58151043 +0000 UTC m=+908.124766076" watchObservedRunningTime="2026-02-17 09:20:30.582499749 +0000 UTC m=+908.125755395" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.599657 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:30 crc kubenswrapper[4848]: I0217 09:20:30.874906 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-c6lrl"] Feb 17 09:20:30 crc kubenswrapper[4848]: W0217 09:20:30.950631 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a049c1c_b425_44cc_bde0_2e83be29d1a1.slice/crio-d27d6afc12569fe024295ee55da2956e918b5a429825582e6d4e8b3a01b1f988 WatchSource:0}: Error finding container d27d6afc12569fe024295ee55da2956e918b5a429825582e6d4e8b3a01b1f988: Status 404 returned error can't find the container with id d27d6afc12569fe024295ee55da2956e918b5a429825582e6d4e8b3a01b1f988 Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.146702 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6ee8-account-create-update-w4zgx" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.180916 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-p7czj" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.203256 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a0b47d8-84ea-4b51-a171-cb713122e873-operator-scripts\") pod \"2a0b47d8-84ea-4b51-a171-cb713122e873\" (UID: \"2a0b47d8-84ea-4b51-a171-cb713122e873\") " Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.203422 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s79v\" (UniqueName: \"kubernetes.io/projected/2a0b47d8-84ea-4b51-a171-cb713122e873-kube-api-access-9s79v\") pod \"2a0b47d8-84ea-4b51-a171-cb713122e873\" (UID: \"2a0b47d8-84ea-4b51-a171-cb713122e873\") " Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.203852 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a0b47d8-84ea-4b51-a171-cb713122e873-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a0b47d8-84ea-4b51-a171-cb713122e873" (UID: "2a0b47d8-84ea-4b51-a171-cb713122e873"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.213373 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a0b47d8-84ea-4b51-a171-cb713122e873-kube-api-access-9s79v" (OuterVolumeSpecName: "kube-api-access-9s79v") pod "2a0b47d8-84ea-4b51-a171-cb713122e873" (UID: "2a0b47d8-84ea-4b51-a171-cb713122e873"). InnerVolumeSpecName "kube-api-access-9s79v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:31 crc kubenswrapper[4848]: W0217 09:20:31.292318 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40074d0d_335c_4da0_900b_43a3cd3ea091.slice/crio-a67f0185733c50f6a449f0811693601e5a3f3f7190be93253d9f5aeb1741af24 WatchSource:0}: Error finding container a67f0185733c50f6a449f0811693601e5a3f3f7190be93253d9f5aeb1741af24: Status 404 returned error can't find the container with id a67f0185733c50f6a449f0811693601e5a3f3f7190be93253d9f5aeb1741af24 Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.293059 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h5g8g"] Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.304696 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhvg8\" (UniqueName: \"kubernetes.io/projected/bbe13273-ef85-4246-99bf-cb85a278c25d-kube-api-access-xhvg8\") pod \"bbe13273-ef85-4246-99bf-cb85a278c25d\" (UID: \"bbe13273-ef85-4246-99bf-cb85a278c25d\") " Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.304857 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbe13273-ef85-4246-99bf-cb85a278c25d-operator-scripts\") pod \"bbe13273-ef85-4246-99bf-cb85a278c25d\" (UID: \"bbe13273-ef85-4246-99bf-cb85a278c25d\") " Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.305239 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbe13273-ef85-4246-99bf-cb85a278c25d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bbe13273-ef85-4246-99bf-cb85a278c25d" (UID: "bbe13273-ef85-4246-99bf-cb85a278c25d"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.305288 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a0b47d8-84ea-4b51-a171-cb713122e873-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.305307 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s79v\" (UniqueName: \"kubernetes.io/projected/2a0b47d8-84ea-4b51-a171-cb713122e873-kube-api-access-9s79v\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.307580 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbe13273-ef85-4246-99bf-cb85a278c25d-kube-api-access-xhvg8" (OuterVolumeSpecName: "kube-api-access-xhvg8") pod "bbe13273-ef85-4246-99bf-cb85a278c25d" (UID: "bbe13273-ef85-4246-99bf-cb85a278c25d"). InnerVolumeSpecName "kube-api-access-xhvg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.406417 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bbe13273-ef85-4246-99bf-cb85a278c25d-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.406693 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhvg8\" (UniqueName: \"kubernetes.io/projected/bbe13273-ef85-4246-99bf-cb85a278c25d-kube-api-access-xhvg8\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.578689 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c6lrl" event={"ID":"6a049c1c-b425-44cc-bde0-2e83be29d1a1","Type":"ContainerStarted","Data":"d27d6afc12569fe024295ee55da2956e918b5a429825582e6d4e8b3a01b1f988"} Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.581332 4848 generic.go:334] "Generic (PLEG): container finished" podID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerID="f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7" exitCode=0 Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.581382 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5g8g" event={"ID":"40074d0d-335c-4da0-900b-43a3cd3ea091","Type":"ContainerDied","Data":"f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7"} Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.581398 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5g8g" event={"ID":"40074d0d-335c-4da0-900b-43a3cd3ea091","Type":"ContainerStarted","Data":"a67f0185733c50f6a449f0811693601e5a3f3f7190be93253d9f5aeb1741af24"} Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.583499 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-p7czj" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.583856 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-p7czj" event={"ID":"bbe13273-ef85-4246-99bf-cb85a278c25d","Type":"ContainerDied","Data":"d1f16c04917b455cb00adebac75486d708c6ce28a21a62de6f12555d7fbe87f8"} Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.583887 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1f16c04917b455cb00adebac75486d708c6ce28a21a62de6f12555d7fbe87f8" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.591739 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6ee8-account-create-update-w4zgx" event={"ID":"2a0b47d8-84ea-4b51-a171-cb713122e873","Type":"ContainerDied","Data":"a686adee2cf1459d8d68b0ce6c08db14eb95ccfcacedfb74766e16da935b49d8"} Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.591800 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a686adee2cf1459d8d68b0ce6c08db14eb95ccfcacedfb74766e16da935b49d8" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.591861 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6ee8-account-create-update-w4zgx" Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.613215 4848 generic.go:334] "Generic (PLEG): container finished" podID="7ffa0ee3-7860-4df2-9a5a-dd960a4851cd" containerID="626b49679a5d5e4c9c615170a4f40a8f6453615e3c804816282db06d7d975ac2" exitCode=0 Feb 17 09:20:31 crc kubenswrapper[4848]: I0217 09:20:31.613265 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-96bhn" event={"ID":"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd","Type":"ContainerDied","Data":"626b49679a5d5e4c9c615170a4f40a8f6453615e3c804816282db06d7d975ac2"} Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.807659 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-5f5pj"] Feb 17 09:20:32 crc kubenswrapper[4848]: E0217 09:20:32.808285 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a0b47d8-84ea-4b51-a171-cb713122e873" containerName="mariadb-account-create-update" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.808297 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a0b47d8-84ea-4b51-a171-cb713122e873" containerName="mariadb-account-create-update" Feb 17 09:20:32 crc kubenswrapper[4848]: E0217 09:20:32.808317 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bbe13273-ef85-4246-99bf-cb85a278c25d" containerName="mariadb-database-create" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.808346 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="bbe13273-ef85-4246-99bf-cb85a278c25d" containerName="mariadb-database-create" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.808520 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="bbe13273-ef85-4246-99bf-cb85a278c25d" containerName="mariadb-database-create" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.808534 4848 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="2a0b47d8-84ea-4b51-a171-cb713122e873" containerName="mariadb-account-create-update" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.809096 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.816835 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-5f5pj"] Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.820067 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.820185 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v66vs" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.835114 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wcfl\" (UniqueName: \"kubernetes.io/projected/ed29dc41-db30-4792-8518-4ef61f232734-kube-api-access-4wcfl\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.835204 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-config-data\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.835281 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-combined-ca-bundle\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 
09:20:32.835394 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-db-sync-config-data\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.936758 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wcfl\" (UniqueName: \"kubernetes.io/projected/ed29dc41-db30-4792-8518-4ef61f232734-kube-api-access-4wcfl\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.936845 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-config-data\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.936891 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-combined-ca-bundle\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.936954 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-db-sync-config-data\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.945392 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-db-sync-config-data\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.945404 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-combined-ca-bundle\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.945548 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-config-data\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:32 crc kubenswrapper[4848]: I0217 09:20:32.952120 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wcfl\" (UniqueName: \"kubernetes.io/projected/ed29dc41-db30-4792-8518-4ef61f232734-kube-api-access-4wcfl\") pod \"glance-db-sync-5f5pj\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.142958 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.281288 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-96bhn" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.343304 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-operator-scripts\") pod \"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd\" (UID: \"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd\") " Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.343382 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hssjk\" (UniqueName: \"kubernetes.io/projected/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-kube-api-access-hssjk\") pod \"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd\" (UID: \"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd\") " Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.344579 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7ffa0ee3-7860-4df2-9a5a-dd960a4851cd" (UID: "7ffa0ee3-7860-4df2-9a5a-dd960a4851cd"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.363028 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-kube-api-access-hssjk" (OuterVolumeSpecName: "kube-api-access-hssjk") pod "7ffa0ee3-7860-4df2-9a5a-dd960a4851cd" (UID: "7ffa0ee3-7860-4df2-9a5a-dd960a4851cd"). InnerVolumeSpecName "kube-api-access-hssjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.398607 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wp62q"] Feb 17 09:20:33 crc kubenswrapper[4848]: E0217 09:20:33.400175 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ffa0ee3-7860-4df2-9a5a-dd960a4851cd" containerName="mariadb-account-create-update" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.400215 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ffa0ee3-7860-4df2-9a5a-dd960a4851cd" containerName="mariadb-account-create-update" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.400407 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ffa0ee3-7860-4df2-9a5a-dd960a4851cd" containerName="mariadb-account-create-update" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.400973 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wp62q" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.411933 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wp62q"] Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.445615 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-operator-scripts\") pod \"keystone-db-create-wp62q\" (UID: \"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7\") " pod="openstack/keystone-db-create-wp62q" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.445886 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plqh7\" (UniqueName: \"kubernetes.io/projected/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-kube-api-access-plqh7\") pod \"keystone-db-create-wp62q\" (UID: \"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7\") " 
pod="openstack/keystone-db-create-wp62q" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.446137 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.446158 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hssjk\" (UniqueName: \"kubernetes.io/projected/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd-kube-api-access-hssjk\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.548175 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-operator-scripts\") pod \"keystone-db-create-wp62q\" (UID: \"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7\") " pod="openstack/keystone-db-create-wp62q" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.548265 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plqh7\" (UniqueName: \"kubernetes.io/projected/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-kube-api-access-plqh7\") pod \"keystone-db-create-wp62q\" (UID: \"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7\") " pod="openstack/keystone-db-create-wp62q" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.549359 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-operator-scripts\") pod \"keystone-db-create-wp62q\" (UID: \"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7\") " pod="openstack/keystone-db-create-wp62q" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.566013 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plqh7\" (UniqueName: 
\"kubernetes.io/projected/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-kube-api-access-plqh7\") pod \"keystone-db-create-wp62q\" (UID: \"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7\") " pod="openstack/keystone-db-create-wp62q" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.598486 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-e78a-account-create-update-rbzcs"] Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.599872 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e78a-account-create-update-rbzcs" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.602770 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.605120 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-w29sg"] Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.606209 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w29sg" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.617497 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w29sg"] Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.633498 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-96bhn" event={"ID":"7ffa0ee3-7860-4df2-9a5a-dd960a4851cd","Type":"ContainerDied","Data":"9783d205251679c949fec7f04bc881b0e5813f131b932a406fef0d7e593eccc0"} Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.633532 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9783d205251679c949fec7f04bc881b0e5813f131b932a406fef0d7e593eccc0" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.633560 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-96bhn" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.646154 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e78a-account-create-update-rbzcs"] Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.649437 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z5fp\" (UniqueName: \"kubernetes.io/projected/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-kube-api-access-2z5fp\") pod \"keystone-e78a-account-create-update-rbzcs\" (UID: \"1009b7c4-b07d-4be4-87a2-d82f1286dfc9\") " pod="openstack/keystone-e78a-account-create-update-rbzcs" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.649645 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/858540f4-3108-4527-bf4f-d163b4f2c66f-operator-scripts\") pod \"placement-db-create-w29sg\" (UID: \"858540f4-3108-4527-bf4f-d163b4f2c66f\") " pod="openstack/placement-db-create-w29sg" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.649716 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm4m5\" (UniqueName: \"kubernetes.io/projected/858540f4-3108-4527-bf4f-d163b4f2c66f-kube-api-access-wm4m5\") pod \"placement-db-create-w29sg\" (UID: \"858540f4-3108-4527-bf4f-d163b4f2c66f\") " pod="openstack/placement-db-create-w29sg" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.649754 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-operator-scripts\") pod \"keystone-e78a-account-create-update-rbzcs\" (UID: \"1009b7c4-b07d-4be4-87a2-d82f1286dfc9\") " pod="openstack/keystone-e78a-account-create-update-rbzcs" Feb 17 09:20:33 crc kubenswrapper[4848]: 
I0217 09:20:33.709995 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-207e-account-create-update-ljmx9"] Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.711261 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-207e-account-create-update-ljmx9" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.713377 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.721243 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-207e-account-create-update-ljmx9"] Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.723695 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wp62q" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.752012 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/858540f4-3108-4527-bf4f-d163b4f2c66f-operator-scripts\") pod \"placement-db-create-w29sg\" (UID: \"858540f4-3108-4527-bf4f-d163b4f2c66f\") " pod="openstack/placement-db-create-w29sg" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.752083 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm4m5\" (UniqueName: \"kubernetes.io/projected/858540f4-3108-4527-bf4f-d163b4f2c66f-kube-api-access-wm4m5\") pod \"placement-db-create-w29sg\" (UID: \"858540f4-3108-4527-bf4f-d163b4f2c66f\") " pod="openstack/placement-db-create-w29sg" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.752111 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-operator-scripts\") pod \"keystone-e78a-account-create-update-rbzcs\" (UID: \"1009b7c4-b07d-4be4-87a2-d82f1286dfc9\") " 
pod="openstack/keystone-e78a-account-create-update-rbzcs" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.752144 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6eaa43b-c60c-415b-9efe-87f15bea768e-operator-scripts\") pod \"placement-207e-account-create-update-ljmx9\" (UID: \"e6eaa43b-c60c-415b-9efe-87f15bea768e\") " pod="openstack/placement-207e-account-create-update-ljmx9" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.752197 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z5fp\" (UniqueName: \"kubernetes.io/projected/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-kube-api-access-2z5fp\") pod \"keystone-e78a-account-create-update-rbzcs\" (UID: \"1009b7c4-b07d-4be4-87a2-d82f1286dfc9\") " pod="openstack/keystone-e78a-account-create-update-rbzcs" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.752295 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzxss\" (UniqueName: \"kubernetes.io/projected/e6eaa43b-c60c-415b-9efe-87f15bea768e-kube-api-access-lzxss\") pod \"placement-207e-account-create-update-ljmx9\" (UID: \"e6eaa43b-c60c-415b-9efe-87f15bea768e\") " pod="openstack/placement-207e-account-create-update-ljmx9" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.753531 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/858540f4-3108-4527-bf4f-d163b4f2c66f-operator-scripts\") pod \"placement-db-create-w29sg\" (UID: \"858540f4-3108-4527-bf4f-d163b4f2c66f\") " pod="openstack/placement-db-create-w29sg" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.754776 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-operator-scripts\") pod \"keystone-e78a-account-create-update-rbzcs\" (UID: \"1009b7c4-b07d-4be4-87a2-d82f1286dfc9\") " pod="openstack/keystone-e78a-account-create-update-rbzcs" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.769419 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm4m5\" (UniqueName: \"kubernetes.io/projected/858540f4-3108-4527-bf4f-d163b4f2c66f-kube-api-access-wm4m5\") pod \"placement-db-create-w29sg\" (UID: \"858540f4-3108-4527-bf4f-d163b4f2c66f\") " pod="openstack/placement-db-create-w29sg" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.770702 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z5fp\" (UniqueName: \"kubernetes.io/projected/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-kube-api-access-2z5fp\") pod \"keystone-e78a-account-create-update-rbzcs\" (UID: \"1009b7c4-b07d-4be4-87a2-d82f1286dfc9\") " pod="openstack/keystone-e78a-account-create-update-rbzcs" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.853511 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzxss\" (UniqueName: \"kubernetes.io/projected/e6eaa43b-c60c-415b-9efe-87f15bea768e-kube-api-access-lzxss\") pod \"placement-207e-account-create-update-ljmx9\" (UID: \"e6eaa43b-c60c-415b-9efe-87f15bea768e\") " pod="openstack/placement-207e-account-create-update-ljmx9" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.853624 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6eaa43b-c60c-415b-9efe-87f15bea768e-operator-scripts\") pod \"placement-207e-account-create-update-ljmx9\" (UID: \"e6eaa43b-c60c-415b-9efe-87f15bea768e\") " pod="openstack/placement-207e-account-create-update-ljmx9" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.854486 4848 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6eaa43b-c60c-415b-9efe-87f15bea768e-operator-scripts\") pod \"placement-207e-account-create-update-ljmx9\" (UID: \"e6eaa43b-c60c-415b-9efe-87f15bea768e\") " pod="openstack/placement-207e-account-create-update-ljmx9" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.869114 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzxss\" (UniqueName: \"kubernetes.io/projected/e6eaa43b-c60c-415b-9efe-87f15bea768e-kube-api-access-lzxss\") pod \"placement-207e-account-create-update-ljmx9\" (UID: \"e6eaa43b-c60c-415b-9efe-87f15bea768e\") " pod="openstack/placement-207e-account-create-update-ljmx9" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.922550 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e78a-account-create-update-rbzcs" Feb 17 09:20:33 crc kubenswrapper[4848]: I0217 09:20:33.938135 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w29sg" Feb 17 09:20:34 crc kubenswrapper[4848]: I0217 09:20:34.031433 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-207e-account-create-update-ljmx9" Feb 17 09:20:34 crc kubenswrapper[4848]: I0217 09:20:34.057620 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:34 crc kubenswrapper[4848]: E0217 09:20:34.057867 4848 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 17 09:20:34 crc kubenswrapper[4848]: E0217 09:20:34.057903 4848 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 17 09:20:34 crc kubenswrapper[4848]: E0217 09:20:34.057974 4848 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift podName:5bc15802-6def-48fe-8fd5-e6d85d068827 nodeName:}" failed. No retries permitted until 2026-02-17 09:20:42.057947869 +0000 UTC m=+919.601203515 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift") pod "swift-storage-0" (UID: "5bc15802-6def-48fe-8fd5-e6d85d068827") : configmap "swift-ring-files" not found Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.115478 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-e78a-account-create-update-rbzcs"] Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.205392 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-5f5pj"] Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.275711 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wp62q"] Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.289240 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-207e-account-create-update-ljmx9"] Feb 17 09:20:35 crc kubenswrapper[4848]: W0217 09:20:35.296554 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod858540f4_3108_4527_bf4f_d163b4f2c66f.slice/crio-90d63c2b3d603093b6b949c512b480406aa99b06cc3e25c163b515c4ad2b0ef8 WatchSource:0}: Error finding container 90d63c2b3d603093b6b949c512b480406aa99b06cc3e25c163b515c4ad2b0ef8: Status 404 returned error can't find the container with id 90d63c2b3d603093b6b949c512b480406aa99b06cc3e25c163b515c4ad2b0ef8 Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.298715 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-w29sg"] Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.407950 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.468173 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7lgm2"] Feb 17 09:20:35 crc 
kubenswrapper[4848]: I0217 09:20:35.468458 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" podUID="84f8e59b-6699-4b32-8772-d9347fd21259" containerName="dnsmasq-dns" containerID="cri-o://5ef7f25940670c927395f21a3af49da52d6a2cd3cf0f6c7a25327fc3127e9ac1" gracePeriod=10 Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.662117 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5f5pj" event={"ID":"ed29dc41-db30-4792-8518-4ef61f232734","Type":"ContainerStarted","Data":"704e89191d3c83803ba77f28bfa1d9e3a8b4cf8141d5f2a801b258c12c3f15e0"} Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.662990 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e78a-account-create-update-rbzcs" event={"ID":"1009b7c4-b07d-4be4-87a2-d82f1286dfc9","Type":"ContainerStarted","Data":"a53d6986604eff76aebbe414b310aa44c08533167c3fe819ed94be0663827152"} Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.664060 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w29sg" event={"ID":"858540f4-3108-4527-bf4f-d163b4f2c66f","Type":"ContainerStarted","Data":"90d63c2b3d603093b6b949c512b480406aa99b06cc3e25c163b515c4ad2b0ef8"} Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.664996 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wp62q" event={"ID":"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7","Type":"ContainerStarted","Data":"f6cc303d65867b156fcff9da9cb2748d7d7d0a9b368b9eb8701849f61ae0ee66"} Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.667173 4848 generic.go:334] "Generic (PLEG): container finished" podID="84f8e59b-6699-4b32-8772-d9347fd21259" containerID="5ef7f25940670c927395f21a3af49da52d6a2cd3cf0f6c7a25327fc3127e9ac1" exitCode=0 Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.667238 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" event={"ID":"84f8e59b-6699-4b32-8772-d9347fd21259","Type":"ContainerDied","Data":"5ef7f25940670c927395f21a3af49da52d6a2cd3cf0f6c7a25327fc3127e9ac1"} Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.668430 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c6lrl" event={"ID":"6a049c1c-b425-44cc-bde0-2e83be29d1a1","Type":"ContainerStarted","Data":"5051da85cd99a359857dc022d2ae6b25618d54b19371e2ebf2e16199090cf95b"} Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.671746 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-207e-account-create-update-ljmx9" event={"ID":"e6eaa43b-c60c-415b-9efe-87f15bea768e","Type":"ContainerStarted","Data":"90e3ae32c9268697d2b215f3164e1a482ad7d44603556c1f6912f6d0cf627f85"} Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.679702 4848 generic.go:334] "Generic (PLEG): container finished" podID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerID="734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51" exitCode=0 Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.679748 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5g8g" event={"ID":"40074d0d-335c-4da0-900b-43a3cd3ea091","Type":"ContainerDied","Data":"734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51"} Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.690669 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-c6lrl" podStartSLOduration=1.966771912 podStartE2EDuration="5.690645968s" podCreationTimestamp="2026-02-17 09:20:30 +0000 UTC" firstStartedPulling="2026-02-17 09:20:30.968278183 +0000 UTC m=+908.511533829" lastFinishedPulling="2026-02-17 09:20:34.692152239 +0000 UTC m=+912.235407885" observedRunningTime="2026-02-17 09:20:35.682964875 +0000 UTC m=+913.226220531" watchObservedRunningTime="2026-02-17 
09:20:35.690645968 +0000 UTC m=+913.233901614" Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.809029 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-96bhn"] Feb 17 09:20:35 crc kubenswrapper[4848]: I0217 09:20:35.816032 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-96bhn"] Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.714252 4848 generic.go:334] "Generic (PLEG): container finished" podID="e6eaa43b-c60c-415b-9efe-87f15bea768e" containerID="1ebd6ee74f3c2dd906a68a542dd080cd6e29e87511b6ed94d0faa2987b16f9e0" exitCode=0 Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.714740 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-207e-account-create-update-ljmx9" event={"ID":"e6eaa43b-c60c-415b-9efe-87f15bea768e","Type":"ContainerDied","Data":"1ebd6ee74f3c2dd906a68a542dd080cd6e29e87511b6ed94d0faa2987b16f9e0"} Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.724001 4848 generic.go:334] "Generic (PLEG): container finished" podID="1009b7c4-b07d-4be4-87a2-d82f1286dfc9" containerID="d212ef58fc2c276356a1df2d260d6e0cc739edb4585cfe26c2d093aa28ca4d07" exitCode=0 Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.724082 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e78a-account-create-update-rbzcs" event={"ID":"1009b7c4-b07d-4be4-87a2-d82f1286dfc9","Type":"ContainerDied","Data":"d212ef58fc2c276356a1df2d260d6e0cc739edb4585cfe26c2d093aa28ca4d07"} Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.739131 4848 generic.go:334] "Generic (PLEG): container finished" podID="858540f4-3108-4527-bf4f-d163b4f2c66f" containerID="86001ceea136307b0385e2ba29bfcab2f52608c6d2e715365152f8e5472e7f36" exitCode=0 Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.739234 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w29sg" 
event={"ID":"858540f4-3108-4527-bf4f-d163b4f2c66f","Type":"ContainerDied","Data":"86001ceea136307b0385e2ba29bfcab2f52608c6d2e715365152f8e5472e7f36"} Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.741597 4848 generic.go:334] "Generic (PLEG): container finished" podID="e60dd67a-6939-45a6-97f1-4f2e54bc4ca7" containerID="c49b0a8baefdf1c945a466937de35cec98bb40ad145930dd6af1aa64c04531bb" exitCode=0 Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.741636 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wp62q" event={"ID":"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7","Type":"ContainerDied","Data":"c49b0a8baefdf1c945a466937de35cec98bb40ad145930dd6af1aa64c04531bb"} Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.746474 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" event={"ID":"84f8e59b-6699-4b32-8772-d9347fd21259","Type":"ContainerDied","Data":"e131e7b00e99d1c78aee7159b862ef6c1f27b4abf0026fd0355a2d5a6ec3cb16"} Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.746541 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e131e7b00e99d1c78aee7159b862ef6c1f27b4abf0026fd0355a2d5a6ec3cb16" Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.768602 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.942417 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-config\") pod \"84f8e59b-6699-4b32-8772-d9347fd21259\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.942511 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-dns-svc\") pod \"84f8e59b-6699-4b32-8772-d9347fd21259\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.942552 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6vxr\" (UniqueName: \"kubernetes.io/projected/84f8e59b-6699-4b32-8772-d9347fd21259-kube-api-access-m6vxr\") pod \"84f8e59b-6699-4b32-8772-d9347fd21259\" (UID: \"84f8e59b-6699-4b32-8772-d9347fd21259\") " Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.960095 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84f8e59b-6699-4b32-8772-d9347fd21259-kube-api-access-m6vxr" (OuterVolumeSpecName: "kube-api-access-m6vxr") pod "84f8e59b-6699-4b32-8772-d9347fd21259" (UID: "84f8e59b-6699-4b32-8772-d9347fd21259"). InnerVolumeSpecName "kube-api-access-m6vxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.983309 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-config" (OuterVolumeSpecName: "config") pod "84f8e59b-6699-4b32-8772-d9347fd21259" (UID: "84f8e59b-6699-4b32-8772-d9347fd21259"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:36 crc kubenswrapper[4848]: I0217 09:20:36.994534 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "84f8e59b-6699-4b32-8772-d9347fd21259" (UID: "84f8e59b-6699-4b32-8772-d9347fd21259"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:37 crc kubenswrapper[4848]: I0217 09:20:37.044271 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:37 crc kubenswrapper[4848]: I0217 09:20:37.044306 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84f8e59b-6699-4b32-8772-d9347fd21259-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:37 crc kubenswrapper[4848]: I0217 09:20:37.044319 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6vxr\" (UniqueName: \"kubernetes.io/projected/84f8e59b-6699-4b32-8772-d9347fd21259-kube-api-access-m6vxr\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:37 crc kubenswrapper[4848]: I0217 09:20:37.394832 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ffa0ee3-7860-4df2-9a5a-dd960a4851cd" path="/var/lib/kubelet/pods/7ffa0ee3-7860-4df2-9a5a-dd960a4851cd/volumes" Feb 17 09:20:37 crc kubenswrapper[4848]: I0217 09:20:37.757784 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5g8g" event={"ID":"40074d0d-335c-4da0-900b-43a3cd3ea091","Type":"ContainerStarted","Data":"3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9"} Feb 17 09:20:37 crc kubenswrapper[4848]: I0217 09:20:37.757837 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-67ff45466c-7lgm2" Feb 17 09:20:37 crc kubenswrapper[4848]: I0217 09:20:37.779447 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h5g8g" podStartSLOduration=2.93385657 podStartE2EDuration="7.779424865s" podCreationTimestamp="2026-02-17 09:20:30 +0000 UTC" firstStartedPulling="2026-02-17 09:20:31.583896682 +0000 UTC m=+909.127152318" lastFinishedPulling="2026-02-17 09:20:36.429464957 +0000 UTC m=+913.972720613" observedRunningTime="2026-02-17 09:20:37.775573823 +0000 UTC m=+915.318829469" watchObservedRunningTime="2026-02-17 09:20:37.779424865 +0000 UTC m=+915.322680531" Feb 17 09:20:37 crc kubenswrapper[4848]: I0217 09:20:37.815865 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7lgm2"] Feb 17 09:20:37 crc kubenswrapper[4848]: I0217 09:20:37.824690 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67ff45466c-7lgm2"] Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.194662 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-207e-account-create-update-ljmx9" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.267340 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzxss\" (UniqueName: \"kubernetes.io/projected/e6eaa43b-c60c-415b-9efe-87f15bea768e-kube-api-access-lzxss\") pod \"e6eaa43b-c60c-415b-9efe-87f15bea768e\" (UID: \"e6eaa43b-c60c-415b-9efe-87f15bea768e\") " Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.267471 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6eaa43b-c60c-415b-9efe-87f15bea768e-operator-scripts\") pod \"e6eaa43b-c60c-415b-9efe-87f15bea768e\" (UID: \"e6eaa43b-c60c-415b-9efe-87f15bea768e\") " Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.268311 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6eaa43b-c60c-415b-9efe-87f15bea768e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e6eaa43b-c60c-415b-9efe-87f15bea768e" (UID: "e6eaa43b-c60c-415b-9efe-87f15bea768e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.273559 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6eaa43b-c60c-415b-9efe-87f15bea768e-kube-api-access-lzxss" (OuterVolumeSpecName: "kube-api-access-lzxss") pod "e6eaa43b-c60c-415b-9efe-87f15bea768e" (UID: "e6eaa43b-c60c-415b-9efe-87f15bea768e"). InnerVolumeSpecName "kube-api-access-lzxss". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.311409 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wp62q" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.319090 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e78a-account-create-update-rbzcs" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.341962 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w29sg" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.368493 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-operator-scripts\") pod \"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7\" (UID: \"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7\") " Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.368540 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plqh7\" (UniqueName: \"kubernetes.io/projected/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-kube-api-access-plqh7\") pod \"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7\" (UID: \"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7\") " Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.368570 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z5fp\" (UniqueName: \"kubernetes.io/projected/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-kube-api-access-2z5fp\") pod \"1009b7c4-b07d-4be4-87a2-d82f1286dfc9\" (UID: \"1009b7c4-b07d-4be4-87a2-d82f1286dfc9\") " Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.368591 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm4m5\" (UniqueName: \"kubernetes.io/projected/858540f4-3108-4527-bf4f-d163b4f2c66f-kube-api-access-wm4m5\") pod \"858540f4-3108-4527-bf4f-d163b4f2c66f\" (UID: \"858540f4-3108-4527-bf4f-d163b4f2c66f\") " Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.368624 4848 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/858540f4-3108-4527-bf4f-d163b4f2c66f-operator-scripts\") pod \"858540f4-3108-4527-bf4f-d163b4f2c66f\" (UID: \"858540f4-3108-4527-bf4f-d163b4f2c66f\") " Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.368657 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-operator-scripts\") pod \"1009b7c4-b07d-4be4-87a2-d82f1286dfc9\" (UID: \"1009b7c4-b07d-4be4-87a2-d82f1286dfc9\") " Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.368917 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzxss\" (UniqueName: \"kubernetes.io/projected/e6eaa43b-c60c-415b-9efe-87f15bea768e-kube-api-access-lzxss\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.368929 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e6eaa43b-c60c-415b-9efe-87f15bea768e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.373988 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1009b7c4-b07d-4be4-87a2-d82f1286dfc9" (UID: "1009b7c4-b07d-4be4-87a2-d82f1286dfc9"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.374550 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e60dd67a-6939-45a6-97f1-4f2e54bc4ca7" (UID: "e60dd67a-6939-45a6-97f1-4f2e54bc4ca7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.375724 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/858540f4-3108-4527-bf4f-d163b4f2c66f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "858540f4-3108-4527-bf4f-d163b4f2c66f" (UID: "858540f4-3108-4527-bf4f-d163b4f2c66f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.375944 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-kube-api-access-2z5fp" (OuterVolumeSpecName: "kube-api-access-2z5fp") pod "1009b7c4-b07d-4be4-87a2-d82f1286dfc9" (UID: "1009b7c4-b07d-4be4-87a2-d82f1286dfc9"). InnerVolumeSpecName "kube-api-access-2z5fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.377871 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/858540f4-3108-4527-bf4f-d163b4f2c66f-kube-api-access-wm4m5" (OuterVolumeSpecName: "kube-api-access-wm4m5") pod "858540f4-3108-4527-bf4f-d163b4f2c66f" (UID: "858540f4-3108-4527-bf4f-d163b4f2c66f"). InnerVolumeSpecName "kube-api-access-wm4m5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.393076 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-kube-api-access-plqh7" (OuterVolumeSpecName: "kube-api-access-plqh7") pod "e60dd67a-6939-45a6-97f1-4f2e54bc4ca7" (UID: "e60dd67a-6939-45a6-97f1-4f2e54bc4ca7"). InnerVolumeSpecName "kube-api-access-plqh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.437370 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.470847 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z5fp\" (UniqueName: \"kubernetes.io/projected/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-kube-api-access-2z5fp\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.470881 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm4m5\" (UniqueName: \"kubernetes.io/projected/858540f4-3108-4527-bf4f-d163b4f2c66f-kube-api-access-wm4m5\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.470891 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/858540f4-3108-4527-bf4f-d163b4f2c66f-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.470900 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1009b7c4-b07d-4be4-87a2-d82f1286dfc9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.470910 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.470918 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plqh7\" (UniqueName: \"kubernetes.io/projected/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7-kube-api-access-plqh7\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.768563 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-w29sg" event={"ID":"858540f4-3108-4527-bf4f-d163b4f2c66f","Type":"ContainerDied","Data":"90d63c2b3d603093b6b949c512b480406aa99b06cc3e25c163b515c4ad2b0ef8"} Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.768580 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-w29sg" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.768597 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90d63c2b3d603093b6b949c512b480406aa99b06cc3e25c163b515c4ad2b0ef8" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.770587 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wp62q" event={"ID":"e60dd67a-6939-45a6-97f1-4f2e54bc4ca7","Type":"ContainerDied","Data":"f6cc303d65867b156fcff9da9cb2748d7d7d0a9b368b9eb8701849f61ae0ee66"} Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.770623 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6cc303d65867b156fcff9da9cb2748d7d7d0a9b368b9eb8701849f61ae0ee66" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.770713 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wp62q" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.781235 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-207e-account-create-update-ljmx9" event={"ID":"e6eaa43b-c60c-415b-9efe-87f15bea768e","Type":"ContainerDied","Data":"90e3ae32c9268697d2b215f3164e1a482ad7d44603556c1f6912f6d0cf627f85"} Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.781284 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-207e-account-create-update-ljmx9" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.781290 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90e3ae32c9268697d2b215f3164e1a482ad7d44603556c1f6912f6d0cf627f85" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.785482 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-e78a-account-create-update-rbzcs" Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.785624 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-e78a-account-create-update-rbzcs" event={"ID":"1009b7c4-b07d-4be4-87a2-d82f1286dfc9","Type":"ContainerDied","Data":"a53d6986604eff76aebbe414b310aa44c08533167c3fe819ed94be0663827152"} Feb 17 09:20:38 crc kubenswrapper[4848]: I0217 09:20:38.785687 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a53d6986604eff76aebbe414b310aa44c08533167c3fe819ed94be0663827152" Feb 17 09:20:39 crc kubenswrapper[4848]: I0217 09:20:39.402910 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84f8e59b-6699-4b32-8772-d9347fd21259" path="/var/lib/kubelet/pods/84f8e59b-6699-4b32-8772-d9347fd21259/volumes" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.600713 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h5g8g" Feb 
17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.600983 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.654751 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.794733 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-884bt"] Feb 17 09:20:40 crc kubenswrapper[4848]: E0217 09:20:40.798781 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84f8e59b-6699-4b32-8772-d9347fd21259" containerName="dnsmasq-dns" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.798808 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f8e59b-6699-4b32-8772-d9347fd21259" containerName="dnsmasq-dns" Feb 17 09:20:40 crc kubenswrapper[4848]: E0217 09:20:40.798823 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="858540f4-3108-4527-bf4f-d163b4f2c66f" containerName="mariadb-database-create" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.798832 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="858540f4-3108-4527-bf4f-d163b4f2c66f" containerName="mariadb-database-create" Feb 17 09:20:40 crc kubenswrapper[4848]: E0217 09:20:40.798854 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1009b7c4-b07d-4be4-87a2-d82f1286dfc9" containerName="mariadb-account-create-update" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.798864 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="1009b7c4-b07d-4be4-87a2-d82f1286dfc9" containerName="mariadb-account-create-update" Feb 17 09:20:40 crc kubenswrapper[4848]: E0217 09:20:40.798883 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6eaa43b-c60c-415b-9efe-87f15bea768e" containerName="mariadb-account-create-update" Feb 17 
09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.798890 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6eaa43b-c60c-415b-9efe-87f15bea768e" containerName="mariadb-account-create-update" Feb 17 09:20:40 crc kubenswrapper[4848]: E0217 09:20:40.798901 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84f8e59b-6699-4b32-8772-d9347fd21259" containerName="init" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.798908 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="84f8e59b-6699-4b32-8772-d9347fd21259" containerName="init" Feb 17 09:20:40 crc kubenswrapper[4848]: E0217 09:20:40.798917 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e60dd67a-6939-45a6-97f1-4f2e54bc4ca7" containerName="mariadb-database-create" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.798924 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e60dd67a-6939-45a6-97f1-4f2e54bc4ca7" containerName="mariadb-database-create" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.799111 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="858540f4-3108-4527-bf4f-d163b4f2c66f" containerName="mariadb-database-create" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.799129 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="84f8e59b-6699-4b32-8772-d9347fd21259" containerName="dnsmasq-dns" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.799140 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e60dd67a-6939-45a6-97f1-4f2e54bc4ca7" containerName="mariadb-database-create" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.799149 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6eaa43b-c60c-415b-9efe-87f15bea768e" containerName="mariadb-account-create-update" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.799158 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="1009b7c4-b07d-4be4-87a2-d82f1286dfc9" 
containerName="mariadb-account-create-update" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.802112 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-884bt" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.811905 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.812030 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-884bt"] Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.812663 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh8mn\" (UniqueName: \"kubernetes.io/projected/4958099b-fe10-4fd4-abaa-00d1520eda93-kube-api-access-gh8mn\") pod \"root-account-create-update-884bt\" (UID: \"4958099b-fe10-4fd4-abaa-00d1520eda93\") " pod="openstack/root-account-create-update-884bt" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.812709 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4958099b-fe10-4fd4-abaa-00d1520eda93-operator-scripts\") pod \"root-account-create-update-884bt\" (UID: \"4958099b-fe10-4fd4-abaa-00d1520eda93\") " pod="openstack/root-account-create-update-884bt" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.914699 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh8mn\" (UniqueName: \"kubernetes.io/projected/4958099b-fe10-4fd4-abaa-00d1520eda93-kube-api-access-gh8mn\") pod \"root-account-create-update-884bt\" (UID: \"4958099b-fe10-4fd4-abaa-00d1520eda93\") " pod="openstack/root-account-create-update-884bt" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.914790 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4958099b-fe10-4fd4-abaa-00d1520eda93-operator-scripts\") pod \"root-account-create-update-884bt\" (UID: \"4958099b-fe10-4fd4-abaa-00d1520eda93\") " pod="openstack/root-account-create-update-884bt" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.915644 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4958099b-fe10-4fd4-abaa-00d1520eda93-operator-scripts\") pod \"root-account-create-update-884bt\" (UID: \"4958099b-fe10-4fd4-abaa-00d1520eda93\") " pod="openstack/root-account-create-update-884bt" Feb 17 09:20:40 crc kubenswrapper[4848]: I0217 09:20:40.944946 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh8mn\" (UniqueName: \"kubernetes.io/projected/4958099b-fe10-4fd4-abaa-00d1520eda93-kube-api-access-gh8mn\") pod \"root-account-create-update-884bt\" (UID: \"4958099b-fe10-4fd4-abaa-00d1520eda93\") " pod="openstack/root-account-create-update-884bt" Feb 17 09:20:41 crc kubenswrapper[4848]: I0217 09:20:41.134699 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-884bt" Feb 17 09:20:41 crc kubenswrapper[4848]: I0217 09:20:41.814961 4848 generic.go:334] "Generic (PLEG): container finished" podID="6a049c1c-b425-44cc-bde0-2e83be29d1a1" containerID="5051da85cd99a359857dc022d2ae6b25618d54b19371e2ebf2e16199090cf95b" exitCode=0 Feb 17 09:20:41 crc kubenswrapper[4848]: I0217 09:20:41.815001 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c6lrl" event={"ID":"6a049c1c-b425-44cc-bde0-2e83be29d1a1","Type":"ContainerDied","Data":"5051da85cd99a359857dc022d2ae6b25618d54b19371e2ebf2e16199090cf95b"} Feb 17 09:20:42 crc kubenswrapper[4848]: I0217 09:20:42.135483 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:42 crc kubenswrapper[4848]: I0217 09:20:42.143946 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5bc15802-6def-48fe-8fd5-e6d85d068827-etc-swift\") pod \"swift-storage-0\" (UID: \"5bc15802-6def-48fe-8fd5-e6d85d068827\") " pod="openstack/swift-storage-0" Feb 17 09:20:42 crc kubenswrapper[4848]: I0217 09:20:42.431547 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 17 09:20:42 crc kubenswrapper[4848]: I0217 09:20:42.729602 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-c695f" podUID="43e80552-f64e-4257-a460-f108ee513c12" containerName="ovn-controller" probeResult="failure" output=< Feb 17 09:20:42 crc kubenswrapper[4848]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 09:20:42 crc kubenswrapper[4848]: > Feb 17 09:20:42 crc kubenswrapper[4848]: I0217 09:20:42.772329 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:20:44 crc kubenswrapper[4848]: I0217 09:20:44.838618 4848 generic.go:334] "Generic (PLEG): container finished" podID="db50eaa9-ca0a-4a83-98d8-fce82f849d91" containerID="812daf3e743d62ecf6c5cde4e775fa48bf1f6c9b01408f073084719ba69e530f" exitCode=0 Feb 17 09:20:44 crc kubenswrapper[4848]: I0217 09:20:44.839205 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db50eaa9-ca0a-4a83-98d8-fce82f849d91","Type":"ContainerDied","Data":"812daf3e743d62ecf6c5cde4e775fa48bf1f6c9b01408f073084719ba69e530f"} Feb 17 09:20:44 crc kubenswrapper[4848]: I0217 09:20:44.842241 4848 generic.go:334] "Generic (PLEG): container finished" podID="bd7e9b9b-99f0-4720-b997-3f00996972e5" containerID="ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71" exitCode=0 Feb 17 09:20:44 crc kubenswrapper[4848]: I0217 09:20:44.842282 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd7e9b9b-99f0-4720-b997-3f00996972e5","Type":"ContainerDied","Data":"ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71"} Feb 17 09:20:47 crc kubenswrapper[4848]: I0217 09:20:47.724859 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-c695f" podUID="43e80552-f64e-4257-a460-f108ee513c12" 
containerName="ovn-controller" probeResult="failure" output=< Feb 17 09:20:47 crc kubenswrapper[4848]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 17 09:20:47 crc kubenswrapper[4848]: > Feb 17 09:20:47 crc kubenswrapper[4848]: I0217 09:20:47.754313 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-jbwkv" Feb 17 09:20:47 crc kubenswrapper[4848]: I0217 09:20:47.988297 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c695f-config-xtpkb"] Feb 17 09:20:47 crc kubenswrapper[4848]: I0217 09:20:47.990114 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:47 crc kubenswrapper[4848]: I0217 09:20:47.993359 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.000275 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c695f-config-xtpkb"] Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.039887 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-log-ovn\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.040005 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run-ovn\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.040059 
4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.040091 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-additional-scripts\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.040135 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djng7\" (UniqueName: \"kubernetes.io/projected/60cf473e-4a59-468f-9cf6-963b24f7d34b-kube-api-access-djng7\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.040180 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-scripts\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.141456 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run-ovn\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 
17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.141508 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.141530 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-additional-scripts\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.141564 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djng7\" (UniqueName: \"kubernetes.io/projected/60cf473e-4a59-468f-9cf6-963b24f7d34b-kube-api-access-djng7\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.141597 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-scripts\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.141648 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-log-ovn\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc 
kubenswrapper[4848]: I0217 09:20:48.141806 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-log-ovn\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.141816 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run-ovn\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.142110 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.142348 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-additional-scripts\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.143495 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-scripts\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.161253 4848 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-djng7\" (UniqueName: \"kubernetes.io/projected/60cf473e-4a59-468f-9cf6-963b24f7d34b-kube-api-access-djng7\") pod \"ovn-controller-c695f-config-xtpkb\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.310290 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.723470 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.851865 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-scripts\") pod \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.851925 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-dispersionconf\") pod \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.851988 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntvrw\" (UniqueName: \"kubernetes.io/projected/6a049c1c-b425-44cc-bde0-2e83be29d1a1-kube-api-access-ntvrw\") pod \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.852015 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-ring-data-devices\") pod \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.852037 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6a049c1c-b425-44cc-bde0-2e83be29d1a1-etc-swift\") pod \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.852075 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-combined-ca-bundle\") pod \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.852148 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-swiftconf\") pod \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\" (UID: \"6a049c1c-b425-44cc-bde0-2e83be29d1a1\") " Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.853966 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6a049c1c-b425-44cc-bde0-2e83be29d1a1" (UID: "6a049c1c-b425-44cc-bde0-2e83be29d1a1"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.854936 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a049c1c-b425-44cc-bde0-2e83be29d1a1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6a049c1c-b425-44cc-bde0-2e83be29d1a1" (UID: "6a049c1c-b425-44cc-bde0-2e83be29d1a1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.856087 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a049c1c-b425-44cc-bde0-2e83be29d1a1-kube-api-access-ntvrw" (OuterVolumeSpecName: "kube-api-access-ntvrw") pod "6a049c1c-b425-44cc-bde0-2e83be29d1a1" (UID: "6a049c1c-b425-44cc-bde0-2e83be29d1a1"). InnerVolumeSpecName "kube-api-access-ntvrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.862436 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6a049c1c-b425-44cc-bde0-2e83be29d1a1" (UID: "6a049c1c-b425-44cc-bde0-2e83be29d1a1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.872592 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-scripts" (OuterVolumeSpecName: "scripts") pod "6a049c1c-b425-44cc-bde0-2e83be29d1a1" (UID: "6a049c1c-b425-44cc-bde0-2e83be29d1a1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.874263 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a049c1c-b425-44cc-bde0-2e83be29d1a1" (UID: "6a049c1c-b425-44cc-bde0-2e83be29d1a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.878592 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db50eaa9-ca0a-4a83-98d8-fce82f849d91","Type":"ContainerStarted","Data":"89c4b7e4ebba0e5a926ad66e26e6c279acfd3e58d484e76f931ec2ced0b778c0"} Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.879289 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.882294 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd7e9b9b-99f0-4720-b997-3f00996972e5","Type":"ContainerStarted","Data":"db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39"} Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.882885 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.884307 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6a049c1c-b425-44cc-bde0-2e83be29d1a1" (UID: "6a049c1c-b425-44cc-bde0-2e83be29d1a1"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.890288 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-c6lrl" event={"ID":"6a049c1c-b425-44cc-bde0-2e83be29d1a1","Type":"ContainerDied","Data":"d27d6afc12569fe024295ee55da2956e918b5a429825582e6d4e8b3a01b1f988"} Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.890317 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d27d6afc12569fe024295ee55da2956e918b5a429825582e6d4e8b3a01b1f988" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.890370 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-c6lrl" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.902166 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.347342908 podStartE2EDuration="1m1.90212163s" podCreationTimestamp="2026-02-17 09:19:47 +0000 UTC" firstStartedPulling="2026-02-17 09:20:02.356139805 +0000 UTC m=+879.899395451" lastFinishedPulling="2026-02-17 09:20:09.910918527 +0000 UTC m=+887.454174173" observedRunningTime="2026-02-17 09:20:48.894433857 +0000 UTC m=+926.437689503" watchObservedRunningTime="2026-02-17 09:20:48.90212163 +0000 UTC m=+926.445377296" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.927231 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=53.066792499 podStartE2EDuration="1m0.927214479s" podCreationTimestamp="2026-02-17 09:19:48 +0000 UTC" firstStartedPulling="2026-02-17 09:20:02.357195546 +0000 UTC m=+879.900451192" lastFinishedPulling="2026-02-17 09:20:10.217617526 +0000 UTC m=+887.760873172" observedRunningTime="2026-02-17 09:20:48.918415623 +0000 UTC m=+926.461671269" watchObservedRunningTime="2026-02-17 09:20:48.927214479 +0000 UTC 
m=+926.470470125" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.982663 4848 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.982688 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.982697 4848 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.982706 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntvrw\" (UniqueName: \"kubernetes.io/projected/6a049c1c-b425-44cc-bde0-2e83be29d1a1-kube-api-access-ntvrw\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.982715 4848 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6a049c1c-b425-44cc-bde0-2e83be29d1a1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.982723 4848 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6a049c1c-b425-44cc-bde0-2e83be29d1a1-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:48 crc kubenswrapper[4848]: I0217 09:20:48.982731 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a049c1c-b425-44cc-bde0-2e83be29d1a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.090273 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/ovn-controller-c695f-config-xtpkb"] Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.095996 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-884bt"] Feb 17 09:20:49 crc kubenswrapper[4848]: W0217 09:20:49.104534 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4958099b_fe10_4fd4_abaa_00d1520eda93.slice/crio-54fe8d1f13a58d67a428904dbaf956af02b369690ce05198f1b6b41c96899196 WatchSource:0}: Error finding container 54fe8d1f13a58d67a428904dbaf956af02b369690ce05198f1b6b41c96899196: Status 404 returned error can't find the container with id 54fe8d1f13a58d67a428904dbaf956af02b369690ce05198f1b6b41c96899196 Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.178730 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 17 09:20:49 crc kubenswrapper[4848]: W0217 09:20:49.193472 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bc15802_6def_48fe_8fd5_e6d85d068827.slice/crio-10e045ff363ff5b1b3bca37de562c69c119f255c4e1c757a6ef74be3245a380d WatchSource:0}: Error finding container 10e045ff363ff5b1b3bca37de562c69c119f255c4e1c757a6ef74be3245a380d: Status 404 returned error can't find the container with id 10e045ff363ff5b1b3bca37de562c69c119f255c4e1c757a6ef74be3245a380d Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.898588 4848 generic.go:334] "Generic (PLEG): container finished" podID="4958099b-fe10-4fd4-abaa-00d1520eda93" containerID="946980ceb65edcae562802ae21a2cb294f5a034371daa6644585de205d1fc1b3" exitCode=0 Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.898690 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-884bt" 
event={"ID":"4958099b-fe10-4fd4-abaa-00d1520eda93","Type":"ContainerDied","Data":"946980ceb65edcae562802ae21a2cb294f5a034371daa6644585de205d1fc1b3"} Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.900197 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-884bt" event={"ID":"4958099b-fe10-4fd4-abaa-00d1520eda93","Type":"ContainerStarted","Data":"54fe8d1f13a58d67a428904dbaf956af02b369690ce05198f1b6b41c96899196"} Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.902488 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"10e045ff363ff5b1b3bca37de562c69c119f255c4e1c757a6ef74be3245a380d"} Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.904174 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5f5pj" event={"ID":"ed29dc41-db30-4792-8518-4ef61f232734","Type":"ContainerStarted","Data":"252f99d92bd4bf0bb1e005048fd69d1f82f58219b2756de4285dcada4bdb9adc"} Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.906534 4848 generic.go:334] "Generic (PLEG): container finished" podID="60cf473e-4a59-468f-9cf6-963b24f7d34b" containerID="53a02f43c94be8c0acb29110d602ddec1d1152e20f6142c0b68c52c159cdf7c9" exitCode=0 Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.906830 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c695f-config-xtpkb" event={"ID":"60cf473e-4a59-468f-9cf6-963b24f7d34b","Type":"ContainerDied","Data":"53a02f43c94be8c0acb29110d602ddec1d1152e20f6142c0b68c52c159cdf7c9"} Feb 17 09:20:49 crc kubenswrapper[4848]: I0217 09:20:49.906866 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c695f-config-xtpkb" event={"ID":"60cf473e-4a59-468f-9cf6-963b24f7d34b","Type":"ContainerStarted","Data":"4241089d3dee6805083a2d3bf1031d72ca65c65e2b0948bde505b3e25acf3387"} Feb 17 09:20:49 crc 
kubenswrapper[4848]: I0217 09:20:49.960521 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-5f5pj" podStartSLOduration=4.47682132 podStartE2EDuration="17.960498708s" podCreationTimestamp="2026-02-17 09:20:32 +0000 UTC" firstStartedPulling="2026-02-17 09:20:35.214566601 +0000 UTC m=+912.757822247" lastFinishedPulling="2026-02-17 09:20:48.698243989 +0000 UTC m=+926.241499635" observedRunningTime="2026-02-17 09:20:49.95300522 +0000 UTC m=+927.496260916" watchObservedRunningTime="2026-02-17 09:20:49.960498708 +0000 UTC m=+927.503754364" Feb 17 09:20:50 crc kubenswrapper[4848]: I0217 09:20:50.658291 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:50 crc kubenswrapper[4848]: I0217 09:20:50.704093 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5g8g"] Feb 17 09:20:50 crc kubenswrapper[4848]: I0217 09:20:50.915562 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"1596412f3912dd67c57b0faeac96c9aebd0cd4100681998d7e78b8a7acf53fa4"} Feb 17 09:20:50 crc kubenswrapper[4848]: I0217 09:20:50.915935 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"6ccac0c551e6ccd2e729011ab1d78d9eadf5e487821001e5acd0b8a63615211b"} Feb 17 09:20:50 crc kubenswrapper[4848]: I0217 09:20:50.915946 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"eebceaec7a754ff499e6ecfde67f4c564bc0183843e434809f0ec89fa0644098"} Feb 17 09:20:50 crc kubenswrapper[4848]: I0217 09:20:50.916121 4848 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/community-operators-h5g8g" podUID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerName="registry-server" containerID="cri-o://3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9" gracePeriod=2 Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.399078 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-884bt" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.497275 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.501000 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.526799 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4958099b-fe10-4fd4-abaa-00d1520eda93-operator-scripts\") pod \"4958099b-fe10-4fd4-abaa-00d1520eda93\" (UID: \"4958099b-fe10-4fd4-abaa-00d1520eda93\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.526873 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh8mn\" (UniqueName: \"kubernetes.io/projected/4958099b-fe10-4fd4-abaa-00d1520eda93-kube-api-access-gh8mn\") pod \"4958099b-fe10-4fd4-abaa-00d1520eda93\" (UID: \"4958099b-fe10-4fd4-abaa-00d1520eda93\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.527718 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4958099b-fe10-4fd4-abaa-00d1520eda93-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4958099b-fe10-4fd4-abaa-00d1520eda93" (UID: "4958099b-fe10-4fd4-abaa-00d1520eda93"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.531281 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4958099b-fe10-4fd4-abaa-00d1520eda93-kube-api-access-gh8mn" (OuterVolumeSpecName: "kube-api-access-gh8mn") pod "4958099b-fe10-4fd4-abaa-00d1520eda93" (UID: "4958099b-fe10-4fd4-abaa-00d1520eda93"). InnerVolumeSpecName "kube-api-access-gh8mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.629243 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-log-ovn\") pod \"60cf473e-4a59-468f-9cf6-963b24f7d34b\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.629330 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djng7\" (UniqueName: \"kubernetes.io/projected/60cf473e-4a59-468f-9cf6-963b24f7d34b-kube-api-access-djng7\") pod \"60cf473e-4a59-468f-9cf6-963b24f7d34b\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.629393 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-catalog-content\") pod \"40074d0d-335c-4da0-900b-43a3cd3ea091\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.629494 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-additional-scripts\") pod \"60cf473e-4a59-468f-9cf6-963b24f7d34b\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 
09:20:51.629530 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-scripts\") pod \"60cf473e-4a59-468f-9cf6-963b24f7d34b\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.629578 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run\") pod \"60cf473e-4a59-468f-9cf6-963b24f7d34b\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.629636 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-utilities\") pod \"40074d0d-335c-4da0-900b-43a3cd3ea091\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.629688 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9knnc\" (UniqueName: \"kubernetes.io/projected/40074d0d-335c-4da0-900b-43a3cd3ea091-kube-api-access-9knnc\") pod \"40074d0d-335c-4da0-900b-43a3cd3ea091\" (UID: \"40074d0d-335c-4da0-900b-43a3cd3ea091\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.629815 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run-ovn\") pod \"60cf473e-4a59-468f-9cf6-963b24f7d34b\" (UID: \"60cf473e-4a59-468f-9cf6-963b24f7d34b\") " Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.630265 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4958099b-fe10-4fd4-abaa-00d1520eda93-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc 
kubenswrapper[4848]: I0217 09:20:51.630294 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh8mn\" (UniqueName: \"kubernetes.io/projected/4958099b-fe10-4fd4-abaa-00d1520eda93-kube-api-access-gh8mn\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.630353 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "60cf473e-4a59-468f-9cf6-963b24f7d34b" (UID: "60cf473e-4a59-468f-9cf6-963b24f7d34b"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.630395 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "60cf473e-4a59-468f-9cf6-963b24f7d34b" (UID: "60cf473e-4a59-468f-9cf6-963b24f7d34b"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.631332 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run" (OuterVolumeSpecName: "var-run") pod "60cf473e-4a59-468f-9cf6-963b24f7d34b" (UID: "60cf473e-4a59-468f-9cf6-963b24f7d34b"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.632167 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-scripts" (OuterVolumeSpecName: "scripts") pod "60cf473e-4a59-468f-9cf6-963b24f7d34b" (UID: "60cf473e-4a59-468f-9cf6-963b24f7d34b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.632460 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "60cf473e-4a59-468f-9cf6-963b24f7d34b" (UID: "60cf473e-4a59-468f-9cf6-963b24f7d34b"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.632466 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-utilities" (OuterVolumeSpecName: "utilities") pod "40074d0d-335c-4da0-900b-43a3cd3ea091" (UID: "40074d0d-335c-4da0-900b-43a3cd3ea091"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.634615 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40074d0d-335c-4da0-900b-43a3cd3ea091-kube-api-access-9knnc" (OuterVolumeSpecName: "kube-api-access-9knnc") pod "40074d0d-335c-4da0-900b-43a3cd3ea091" (UID: "40074d0d-335c-4da0-900b-43a3cd3ea091"). InnerVolumeSpecName "kube-api-access-9knnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.635038 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60cf473e-4a59-468f-9cf6-963b24f7d34b-kube-api-access-djng7" (OuterVolumeSpecName: "kube-api-access-djng7") pod "60cf473e-4a59-468f-9cf6-963b24f7d34b" (UID: "60cf473e-4a59-468f-9cf6-963b24f7d34b"). InnerVolumeSpecName "kube-api-access-djng7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.681734 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40074d0d-335c-4da0-900b-43a3cd3ea091" (UID: "40074d0d-335c-4da0-900b-43a3cd3ea091"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.731610 4848 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.731645 4848 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.731657 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djng7\" (UniqueName: \"kubernetes.io/projected/60cf473e-4a59-468f-9cf6-963b24f7d34b-kube-api-access-djng7\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.731677 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.731690 4848 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.731702 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/60cf473e-4a59-468f-9cf6-963b24f7d34b-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.731712 4848 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/60cf473e-4a59-468f-9cf6-963b24f7d34b-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.731723 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40074d0d-335c-4da0-900b-43a3cd3ea091-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.731733 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9knnc\" (UniqueName: \"kubernetes.io/projected/40074d0d-335c-4da0-900b-43a3cd3ea091-kube-api-access-9knnc\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.925631 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-884bt" event={"ID":"4958099b-fe10-4fd4-abaa-00d1520eda93","Type":"ContainerDied","Data":"54fe8d1f13a58d67a428904dbaf956af02b369690ce05198f1b6b41c96899196"} Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.925695 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54fe8d1f13a58d67a428904dbaf956af02b369690ce05198f1b6b41c96899196" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.925660 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-884bt" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.929564 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"e97ea2ed528a75a0fb94c0c18f1ffb7493e262ef6ecfb844611dae0537c02cce"} Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.934104 4848 generic.go:334] "Generic (PLEG): container finished" podID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerID="3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9" exitCode=0 Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.934166 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5g8g" event={"ID":"40074d0d-335c-4da0-900b-43a3cd3ea091","Type":"ContainerDied","Data":"3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9"} Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.934189 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h5g8g" event={"ID":"40074d0d-335c-4da0-900b-43a3cd3ea091","Type":"ContainerDied","Data":"a67f0185733c50f6a449f0811693601e5a3f3f7190be93253d9f5aeb1741af24"} Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.934211 4848 scope.go:117] "RemoveContainer" containerID="3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.934331 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h5g8g" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.937076 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c695f-config-xtpkb" event={"ID":"60cf473e-4a59-468f-9cf6-963b24f7d34b","Type":"ContainerDied","Data":"4241089d3dee6805083a2d3bf1031d72ca65c65e2b0948bde505b3e25acf3387"} Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.937119 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4241089d3dee6805083a2d3bf1031d72ca65c65e2b0948bde505b3e25acf3387" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.937132 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c695f-config-xtpkb" Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.986733 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h5g8g"] Feb 17 09:20:51 crc kubenswrapper[4848]: I0217 09:20:51.994132 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h5g8g"] Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.090208 4848 scope.go:117] "RemoveContainer" containerID="734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.129628 4848 scope.go:117] "RemoveContainer" containerID="f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.147022 4848 scope.go:117] "RemoveContainer" containerID="3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9" Feb 17 09:20:52 crc kubenswrapper[4848]: E0217 09:20:52.147498 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9\": container with ID starting with 
3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9 not found: ID does not exist" containerID="3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.147534 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9"} err="failed to get container status \"3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9\": rpc error: code = NotFound desc = could not find container \"3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9\": container with ID starting with 3100b40b5089ed93bc96bacf9fb91c9bec3e59a77feca6893dfd3e063040c4e9 not found: ID does not exist" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.147557 4848 scope.go:117] "RemoveContainer" containerID="734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51" Feb 17 09:20:52 crc kubenswrapper[4848]: E0217 09:20:52.147782 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51\": container with ID starting with 734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51 not found: ID does not exist" containerID="734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.147809 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51"} err="failed to get container status \"734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51\": rpc error: code = NotFound desc = could not find container \"734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51\": container with ID starting with 734a00d54f8585b8ab0770e94926ebf3696dda09f2f74442bf8dffed9904de51 not found: ID does not 
exist" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.147822 4848 scope.go:117] "RemoveContainer" containerID="f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7" Feb 17 09:20:52 crc kubenswrapper[4848]: E0217 09:20:52.148142 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7\": container with ID starting with f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7 not found: ID does not exist" containerID="f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.148190 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7"} err="failed to get container status \"f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7\": rpc error: code = NotFound desc = could not find container \"f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7\": container with ID starting with f9dc46de446ec6286cd38b030b0a109b35f1c9a00b79ec87222850075e84bca7 not found: ID does not exist" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.603846 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-c695f-config-xtpkb"] Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.609515 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-c695f-config-xtpkb"] Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.729699 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-c695f-config-bxvr8"] Feb 17 09:20:52 crc kubenswrapper[4848]: E0217 09:20:52.730387 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a049c1c-b425-44cc-bde0-2e83be29d1a1" containerName="swift-ring-rebalance" Feb 17 09:20:52 crc kubenswrapper[4848]: 
I0217 09:20:52.730411 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a049c1c-b425-44cc-bde0-2e83be29d1a1" containerName="swift-ring-rebalance" Feb 17 09:20:52 crc kubenswrapper[4848]: E0217 09:20:52.730428 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerName="registry-server" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.730436 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerName="registry-server" Feb 17 09:20:52 crc kubenswrapper[4848]: E0217 09:20:52.730448 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4958099b-fe10-4fd4-abaa-00d1520eda93" containerName="mariadb-account-create-update" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.730454 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="4958099b-fe10-4fd4-abaa-00d1520eda93" containerName="mariadb-account-create-update" Feb 17 09:20:52 crc kubenswrapper[4848]: E0217 09:20:52.730473 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerName="extract-utilities" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.730479 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerName="extract-utilities" Feb 17 09:20:52 crc kubenswrapper[4848]: E0217 09:20:52.730486 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerName="extract-content" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.730491 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerName="extract-content" Feb 17 09:20:52 crc kubenswrapper[4848]: E0217 09:20:52.730503 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60cf473e-4a59-468f-9cf6-963b24f7d34b" containerName="ovn-config" Feb 17 09:20:52 crc 
kubenswrapper[4848]: I0217 09:20:52.730509 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="60cf473e-4a59-468f-9cf6-963b24f7d34b" containerName="ovn-config" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.730664 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="60cf473e-4a59-468f-9cf6-963b24f7d34b" containerName="ovn-config" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.730688 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a049c1c-b425-44cc-bde0-2e83be29d1a1" containerName="swift-ring-rebalance" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.730697 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="4958099b-fe10-4fd4-abaa-00d1520eda93" containerName="mariadb-account-create-update" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.730709 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="40074d0d-335c-4da0-900b-43a3cd3ea091" containerName="registry-server" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.731385 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-c695f" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.731472 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.741038 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.750985 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c695f-config-bxvr8"] Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.856632 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-scripts\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.856684 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-additional-scripts\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.856707 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mrv8\" (UniqueName: \"kubernetes.io/projected/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-kube-api-access-7mrv8\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.856826 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run-ovn\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: 
\"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.856846 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-log-ovn\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.856870 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.946062 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"048bca9e9134f4e42c25eaf66dc2ad4e7e9766be6c0776ef46c641340f1500f9"} Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.946102 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"5749f4adb3390ccc8c0e2988735bfea360027f71523fdee5abe1ca05b05096f0"} Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.946112 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"9822101845b9af35379d6d12c39f6543639eeb4a9911ad175d16eb2024ecfee7"} Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.946121 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"60503f9fa3b2f892bc03e6b6181f68ea0c2a84943f286f737d046c03a0dfacc3"} Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.957827 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run-ovn\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.957868 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-log-ovn\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.957894 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.957962 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-scripts\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.957984 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-additional-scripts\") pod 
\"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.958001 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mrv8\" (UniqueName: \"kubernetes.io/projected/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-kube-api-access-7mrv8\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.958135 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run-ovn\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.958211 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.958888 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-additional-scripts\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.959023 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-log-ovn\") pod \"ovn-controller-c695f-config-bxvr8\" 
(UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.959828 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-scripts\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:52 crc kubenswrapper[4848]: I0217 09:20:52.975231 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mrv8\" (UniqueName: \"kubernetes.io/projected/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-kube-api-access-7mrv8\") pod \"ovn-controller-c695f-config-bxvr8\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:53 crc kubenswrapper[4848]: I0217 09:20:53.081531 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:53 crc kubenswrapper[4848]: I0217 09:20:53.412297 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40074d0d-335c-4da0-900b-43a3cd3ea091" path="/var/lib/kubelet/pods/40074d0d-335c-4da0-900b-43a3cd3ea091/volumes" Feb 17 09:20:53 crc kubenswrapper[4848]: I0217 09:20:53.413573 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60cf473e-4a59-468f-9cf6-963b24f7d34b" path="/var/lib/kubelet/pods/60cf473e-4a59-468f-9cf6-963b24f7d34b/volumes" Feb 17 09:20:53 crc kubenswrapper[4848]: I0217 09:20:53.574150 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-c695f-config-bxvr8"] Feb 17 09:20:53 crc kubenswrapper[4848]: W0217 09:20:53.585122 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13747b05_25cf_4d6d_a9aa_79735c5ac3d9.slice/crio-ec23e158aa489e21d58ca2ff4639a7e9a22ce2ddb7731df538b5c333a652aa76 WatchSource:0}: Error finding container ec23e158aa489e21d58ca2ff4639a7e9a22ce2ddb7731df538b5c333a652aa76: Status 404 returned error can't find the container with id ec23e158aa489e21d58ca2ff4639a7e9a22ce2ddb7731df538b5c333a652aa76 Feb 17 09:20:53 crc kubenswrapper[4848]: I0217 09:20:53.956443 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c695f-config-bxvr8" event={"ID":"13747b05-25cf-4d6d-a9aa-79735c5ac3d9","Type":"ContainerStarted","Data":"d14973f2fd21fcd48a589b585106fc57fe7b224455c3b47a22d9cdc4555c7755"} Feb 17 09:20:53 crc kubenswrapper[4848]: I0217 09:20:53.956702 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c695f-config-bxvr8" event={"ID":"13747b05-25cf-4d6d-a9aa-79735c5ac3d9","Type":"ContainerStarted","Data":"ec23e158aa489e21d58ca2ff4639a7e9a22ce2ddb7731df538b5c333a652aa76"} Feb 17 09:20:53 crc kubenswrapper[4848]: I0217 09:20:53.973830 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-c695f-config-bxvr8" podStartSLOduration=1.973813599 podStartE2EDuration="1.973813599s" podCreationTimestamp="2026-02-17 09:20:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:20:53.970081751 +0000 UTC m=+931.513337397" watchObservedRunningTime="2026-02-17 09:20:53.973813599 +0000 UTC m=+931.517069245" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.304752 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8rrkh"] Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.306567 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.329936 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8rrkh"] Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.385526 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-utilities\") pod \"redhat-operators-8rrkh\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.385598 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-catalog-content\") pod \"redhat-operators-8rrkh\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.385627 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx226\" (UniqueName: \"kubernetes.io/projected/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-kube-api-access-qx226\") pod \"redhat-operators-8rrkh\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.486706 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-utilities\") pod \"redhat-operators-8rrkh\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.486789 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-catalog-content\") pod \"redhat-operators-8rrkh\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.486805 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx226\" (UniqueName: \"kubernetes.io/projected/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-kube-api-access-qx226\") pod \"redhat-operators-8rrkh\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.487458 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-utilities\") pod \"redhat-operators-8rrkh\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.487682 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-catalog-content\") pod \"redhat-operators-8rrkh\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.505407 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx226\" (UniqueName: \"kubernetes.io/projected/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-kube-api-access-qx226\") pod \"redhat-operators-8rrkh\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.619959 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.967413 4848 generic.go:334] "Generic (PLEG): container finished" podID="13747b05-25cf-4d6d-a9aa-79735c5ac3d9" containerID="d14973f2fd21fcd48a589b585106fc57fe7b224455c3b47a22d9cdc4555c7755" exitCode=0 Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.968042 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-c695f-config-bxvr8" event={"ID":"13747b05-25cf-4d6d-a9aa-79735c5ac3d9","Type":"ContainerDied","Data":"d14973f2fd21fcd48a589b585106fc57fe7b224455c3b47a22d9cdc4555c7755"} Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.995601 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"636f1ef8722e1313dc079e20b01d99811793ba2e496980e2c8033c10ed1a4e95"} Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.995644 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"59d537c85c51fe7f71b40903e05675a7dd0dfc108ba57470c8196c217a46fedc"} Feb 17 09:20:54 crc kubenswrapper[4848]: I0217 09:20:54.995656 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"aa220ee6c30f15318f4150db71f103117722ffca40287fbebabff52a8c0bd000"} Feb 17 09:20:55 crc kubenswrapper[4848]: I0217 09:20:55.111127 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8rrkh"] Feb 17 09:20:55 crc kubenswrapper[4848]: W0217 09:20:55.117838 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6791ad5_1ad5_45f3_a8a4_f16b14013bba.slice/crio-349ea0aa5945675d9b472fec86cfb7ac9a5a63efb3759a8c0a3bdfb86697a80b WatchSource:0}: Error finding container 349ea0aa5945675d9b472fec86cfb7ac9a5a63efb3759a8c0a3bdfb86697a80b: Status 404 returned error can't find the container with id 349ea0aa5945675d9b472fec86cfb7ac9a5a63efb3759a8c0a3bdfb86697a80b Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.005922 4848 generic.go:334] "Generic (PLEG): container finished" podID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerID="6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973" exitCode=0 Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.006017 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rrkh" event={"ID":"d6791ad5-1ad5-45f3-a8a4-f16b14013bba","Type":"ContainerDied","Data":"6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973"} Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.006296 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rrkh" event={"ID":"d6791ad5-1ad5-45f3-a8a4-f16b14013bba","Type":"ContainerStarted","Data":"349ea0aa5945675d9b472fec86cfb7ac9a5a63efb3759a8c0a3bdfb86697a80b"} Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.015106 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"4c148c5c83b1c6c6192e5a9a339fac62e4ae2ba1535bcc32efb402807ab8f875"} Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.015143 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"cfae93be9592314b3fd93807a27ca4fe2862c623423152c8ee62df3253f2d49c"} Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.015153 4848 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"90ee31a5c705955300de47c399fd489df588e5e1c4ddc07e84ed1fa5635a9723"} Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.015162 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"5bc15802-6def-48fe-8fd5-e6d85d068827","Type":"ContainerStarted","Data":"65e8a8a7b359ec3baf328ab39f30e610a3f9e90847cd9e4b602912cb572d4ae8"} Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.064979 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=26.143528279 podStartE2EDuration="31.064960955s" podCreationTimestamp="2026-02-17 09:20:25 +0000 UTC" firstStartedPulling="2026-02-17 09:20:49.197154868 +0000 UTC m=+926.740410504" lastFinishedPulling="2026-02-17 09:20:54.118587534 +0000 UTC m=+931.661843180" observedRunningTime="2026-02-17 09:20:56.055191441 +0000 UTC m=+933.598447087" watchObservedRunningTime="2026-02-17 09:20:56.064960955 +0000 UTC m=+933.608216591" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.383378 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-778944759-7769n"] Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.385418 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.387061 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.394136 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778944759-7769n"] Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.430261 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.518595 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-scripts\") pod \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.518636 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run-ovn\") pod \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.518679 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-additional-scripts\") pod \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.518756 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "13747b05-25cf-4d6d-a9aa-79735c5ac3d9" (UID: "13747b05-25cf-4d6d-a9aa-79735c5ac3d9"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.518796 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run\") pod \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.518817 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-log-ovn\") pod \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.518857 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mrv8\" (UniqueName: \"kubernetes.io/projected/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-kube-api-access-7mrv8\") pod \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\" (UID: \"13747b05-25cf-4d6d-a9aa-79735c5ac3d9\") " Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.518906 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run" (OuterVolumeSpecName: "var-run") pod "13747b05-25cf-4d6d-a9aa-79735c5ac3d9" (UID: "13747b05-25cf-4d6d-a9aa-79735c5ac3d9"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.518933 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "13747b05-25cf-4d6d-a9aa-79735c5ac3d9" (UID: "13747b05-25cf-4d6d-a9aa-79735c5ac3d9"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.519048 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-config\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.519088 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-svc\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.519266 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "13747b05-25cf-4d6d-a9aa-79735c5ac3d9" (UID: "13747b05-25cf-4d6d-a9aa-79735c5ac3d9"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.519288 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-nb\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.519401 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6xqm\" (UniqueName: \"kubernetes.io/projected/aaf65cec-c416-49a5-87db-11d9713a96ed-kube-api-access-f6xqm\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.519526 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-swift-storage-0\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.519637 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-sb\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.519986 4848 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:56 crc kubenswrapper[4848]: 
I0217 09:20:56.520022 4848 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.520036 4848 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.520050 4848 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.521113 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-scripts" (OuterVolumeSpecName: "scripts") pod "13747b05-25cf-4d6d-a9aa-79735c5ac3d9" (UID: "13747b05-25cf-4d6d-a9aa-79735c5ac3d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.524902 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-kube-api-access-7mrv8" (OuterVolumeSpecName: "kube-api-access-7mrv8") pod "13747b05-25cf-4d6d-a9aa-79735c5ac3d9" (UID: "13747b05-25cf-4d6d-a9aa-79735c5ac3d9"). InnerVolumeSpecName "kube-api-access-7mrv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.623985 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-nb\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.624402 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6xqm\" (UniqueName: \"kubernetes.io/projected/aaf65cec-c416-49a5-87db-11d9713a96ed-kube-api-access-f6xqm\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.624424 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-swift-storage-0\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.624757 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-nb\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.625194 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-swift-storage-0\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " 
pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.625307 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-sb\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.625360 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-sb\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.625391 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-config\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.625993 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-config\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.626063 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-svc\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.626691 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-svc\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.627000 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mrv8\" (UniqueName: \"kubernetes.io/projected/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-kube-api-access-7mrv8\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.627025 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/13747b05-25cf-4d6d-a9aa-79735c5ac3d9-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.652566 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6xqm\" (UniqueName: \"kubernetes.io/projected/aaf65cec-c416-49a5-87db-11d9713a96ed-kube-api-access-f6xqm\") pod \"dnsmasq-dns-778944759-7769n\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.653152 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-c695f-config-bxvr8"] Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.660614 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-c695f-config-bxvr8"] Feb 17 09:20:56 crc kubenswrapper[4848]: I0217 09:20:56.740596 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:20:57 crc kubenswrapper[4848]: I0217 09:20:57.026968 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ec23e158aa489e21d58ca2ff4639a7e9a22ce2ddb7731df538b5c333a652aa76" Feb 17 09:20:57 crc kubenswrapper[4848]: I0217 09:20:57.027127 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-c695f-config-bxvr8" Feb 17 09:20:57 crc kubenswrapper[4848]: I0217 09:20:57.029952 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rrkh" event={"ID":"d6791ad5-1ad5-45f3-a8a4-f16b14013bba","Type":"ContainerStarted","Data":"243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049"} Feb 17 09:20:57 crc kubenswrapper[4848]: I0217 09:20:57.177684 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778944759-7769n"] Feb 17 09:20:57 crc kubenswrapper[4848]: W0217 09:20:57.180025 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaaf65cec_c416_49a5_87db_11d9713a96ed.slice/crio-8767a74883d5247e94758f952c587007a2e404dbc53034e8090b54ccfba98376 WatchSource:0}: Error finding container 8767a74883d5247e94758f952c587007a2e404dbc53034e8090b54ccfba98376: Status 404 returned error can't find the container with id 8767a74883d5247e94758f952c587007a2e404dbc53034e8090b54ccfba98376 Feb 17 09:20:57 crc kubenswrapper[4848]: I0217 09:20:57.396988 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13747b05-25cf-4d6d-a9aa-79735c5ac3d9" path="/var/lib/kubelet/pods/13747b05-25cf-4d6d-a9aa-79735c5ac3d9/volumes" Feb 17 09:20:58 crc kubenswrapper[4848]: I0217 09:20:58.048217 4848 generic.go:334] "Generic (PLEG): container finished" podID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" 
containerID="243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049" exitCode=0 Feb 17 09:20:58 crc kubenswrapper[4848]: I0217 09:20:58.048301 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rrkh" event={"ID":"d6791ad5-1ad5-45f3-a8a4-f16b14013bba","Type":"ContainerDied","Data":"243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049"} Feb 17 09:20:58 crc kubenswrapper[4848]: I0217 09:20:58.054494 4848 generic.go:334] "Generic (PLEG): container finished" podID="ed29dc41-db30-4792-8518-4ef61f232734" containerID="252f99d92bd4bf0bb1e005048fd69d1f82f58219b2756de4285dcada4bdb9adc" exitCode=0 Feb 17 09:20:58 crc kubenswrapper[4848]: I0217 09:20:58.054598 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5f5pj" event={"ID":"ed29dc41-db30-4792-8518-4ef61f232734","Type":"ContainerDied","Data":"252f99d92bd4bf0bb1e005048fd69d1f82f58219b2756de4285dcada4bdb9adc"} Feb 17 09:20:58 crc kubenswrapper[4848]: I0217 09:20:58.063941 4848 generic.go:334] "Generic (PLEG): container finished" podID="aaf65cec-c416-49a5-87db-11d9713a96ed" containerID="09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4" exitCode=0 Feb 17 09:20:58 crc kubenswrapper[4848]: I0217 09:20:58.063991 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778944759-7769n" event={"ID":"aaf65cec-c416-49a5-87db-11d9713a96ed","Type":"ContainerDied","Data":"09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4"} Feb 17 09:20:58 crc kubenswrapper[4848]: I0217 09:20:58.064020 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778944759-7769n" event={"ID":"aaf65cec-c416-49a5-87db-11d9713a96ed","Type":"ContainerStarted","Data":"8767a74883d5247e94758f952c587007a2e404dbc53034e8090b54ccfba98376"} Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.072343 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-8rrkh" event={"ID":"d6791ad5-1ad5-45f3-a8a4-f16b14013bba","Type":"ContainerStarted","Data":"77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80"} Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.074168 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778944759-7769n" event={"ID":"aaf65cec-c416-49a5-87db-11d9713a96ed","Type":"ContainerStarted","Data":"85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d"} Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.098124 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8rrkh" podStartSLOduration=2.402699064 podStartE2EDuration="5.098105829s" podCreationTimestamp="2026-02-17 09:20:54 +0000 UTC" firstStartedPulling="2026-02-17 09:20:56.009334369 +0000 UTC m=+933.552590015" lastFinishedPulling="2026-02-17 09:20:58.704741124 +0000 UTC m=+936.247996780" observedRunningTime="2026-02-17 09:20:59.094334749 +0000 UTC m=+936.637590405" watchObservedRunningTime="2026-02-17 09:20:59.098105829 +0000 UTC m=+936.641361475" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.117550 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-778944759-7769n" podStartSLOduration=3.117526733 podStartE2EDuration="3.117526733s" podCreationTimestamp="2026-02-17 09:20:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:20:59.112582699 +0000 UTC m=+936.655838355" watchObservedRunningTime="2026-02-17 09:20:59.117526733 +0000 UTC m=+936.660782379" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.227876 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="db50eaa9-ca0a-4a83-98d8-fce82f849d91" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.94:5671: 
connect: connection refused" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.506441 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-5f5pj" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.596863 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-combined-ca-bundle\") pod \"ed29dc41-db30-4792-8518-4ef61f232734\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.596980 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wcfl\" (UniqueName: \"kubernetes.io/projected/ed29dc41-db30-4792-8518-4ef61f232734-kube-api-access-4wcfl\") pod \"ed29dc41-db30-4792-8518-4ef61f232734\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.597101 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-config-data\") pod \"ed29dc41-db30-4792-8518-4ef61f232734\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.597130 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-db-sync-config-data\") pod \"ed29dc41-db30-4792-8518-4ef61f232734\" (UID: \"ed29dc41-db30-4792-8518-4ef61f232734\") " Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.604437 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ed29dc41-db30-4792-8518-4ef61f232734" (UID: "ed29dc41-db30-4792-8518-4ef61f232734"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.604839 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed29dc41-db30-4792-8518-4ef61f232734-kube-api-access-4wcfl" (OuterVolumeSpecName: "kube-api-access-4wcfl") pod "ed29dc41-db30-4792-8518-4ef61f232734" (UID: "ed29dc41-db30-4792-8518-4ef61f232734"). InnerVolumeSpecName "kube-api-access-4wcfl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.637462 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed29dc41-db30-4792-8518-4ef61f232734" (UID: "ed29dc41-db30-4792-8518-4ef61f232734"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.663214 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-config-data" (OuterVolumeSpecName: "config-data") pod "ed29dc41-db30-4792-8518-4ef61f232734" (UID: "ed29dc41-db30-4792-8518-4ef61f232734"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.699496 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.699550 4848 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.699564 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed29dc41-db30-4792-8518-4ef61f232734-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.699576 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wcfl\" (UniqueName: \"kubernetes.io/projected/ed29dc41-db30-4792-8518-4ef61f232734-kube-api-access-4wcfl\") on node \"crc\" DevicePath \"\"" Feb 17 09:20:59 crc kubenswrapper[4848]: I0217 09:20:59.824962 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.084979 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-5f5pj" event={"ID":"ed29dc41-db30-4792-8518-4ef61f232734","Type":"ContainerDied","Data":"704e89191d3c83803ba77f28bfa1d9e3a8b4cf8141d5f2a801b258c12c3f15e0"} Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.085014 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-5f5pj" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.085039 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="704e89191d3c83803ba77f28bfa1d9e3a8b4cf8141d5f2a801b258c12c3f15e0" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.085704 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.583883 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778944759-7769n"] Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.627111 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-mr8wv"] Feb 17 09:21:00 crc kubenswrapper[4848]: E0217 09:21:00.628386 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed29dc41-db30-4792-8518-4ef61f232734" containerName="glance-db-sync" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.628407 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed29dc41-db30-4792-8518-4ef61f232734" containerName="glance-db-sync" Feb 17 09:21:00 crc kubenswrapper[4848]: E0217 09:21:00.628445 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13747b05-25cf-4d6d-a9aa-79735c5ac3d9" containerName="ovn-config" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.628453 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="13747b05-25cf-4d6d-a9aa-79735c5ac3d9" containerName="ovn-config" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.628660 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed29dc41-db30-4792-8518-4ef61f232734" containerName="glance-db-sync" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.628681 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="13747b05-25cf-4d6d-a9aa-79735c5ac3d9" containerName="ovn-config" Feb 17 09:21:00 crc 
kubenswrapper[4848]: I0217 09:21:00.629987 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.641952 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-mr8wv"] Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.816597 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-sb\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.816645 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-svc\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.816685 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-config\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.816818 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-swift-storage-0\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.816865 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-nb\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.816898 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm7c4\" (UniqueName: \"kubernetes.io/projected/bcafc0d6-9155-456a-b63d-b5c1944fb51c-kube-api-access-wm7c4\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.918829 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-swift-storage-0\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.918889 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-nb\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.918934 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm7c4\" (UniqueName: \"kubernetes.io/projected/bcafc0d6-9155-456a-b63d-b5c1944fb51c-kube-api-access-wm7c4\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 
09:21:00.918983 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-sb\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.919010 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-svc\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.919032 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-config\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.920092 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-config\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.920345 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-swift-storage-0\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.920951 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-sb\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.921035 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-nb\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.921634 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-svc\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.936477 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm7c4\" (UniqueName: \"kubernetes.io/projected/bcafc0d6-9155-456a-b63d-b5c1944fb51c-kube-api-access-wm7c4\") pod \"dnsmasq-dns-96fb4d4c9-mr8wv\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:00 crc kubenswrapper[4848]: I0217 09:21:00.948249 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:01 crc kubenswrapper[4848]: I0217 09:21:01.405077 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-mr8wv"] Feb 17 09:21:01 crc kubenswrapper[4848]: W0217 09:21:01.406476 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcafc0d6_9155_456a_b63d_b5c1944fb51c.slice/crio-2cb349136673532d0937ad1f21364bf9e2e571f0d03e6177b550ee568248d9b2 WatchSource:0}: Error finding container 2cb349136673532d0937ad1f21364bf9e2e571f0d03e6177b550ee568248d9b2: Status 404 returned error can't find the container with id 2cb349136673532d0937ad1f21364bf9e2e571f0d03e6177b550ee568248d9b2 Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.105688 4848 generic.go:334] "Generic (PLEG): container finished" podID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerID="915958aff7309f7c86128c894bb730fb7b32670baf1c2043ad27102c1fd5c13d" exitCode=0 Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.105751 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" event={"ID":"bcafc0d6-9155-456a-b63d-b5c1944fb51c","Type":"ContainerDied","Data":"915958aff7309f7c86128c894bb730fb7b32670baf1c2043ad27102c1fd5c13d"} Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.106181 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" event={"ID":"bcafc0d6-9155-456a-b63d-b5c1944fb51c","Type":"ContainerStarted","Data":"2cb349136673532d0937ad1f21364bf9e2e571f0d03e6177b550ee568248d9b2"} Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.106284 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-778944759-7769n" podUID="aaf65cec-c416-49a5-87db-11d9713a96ed" containerName="dnsmasq-dns" containerID="cri-o://85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d" 
gracePeriod=10 Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.548080 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.646464 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-config\") pod \"aaf65cec-c416-49a5-87db-11d9713a96ed\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.646631 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6xqm\" (UniqueName: \"kubernetes.io/projected/aaf65cec-c416-49a5-87db-11d9713a96ed-kube-api-access-f6xqm\") pod \"aaf65cec-c416-49a5-87db-11d9713a96ed\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.646672 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-swift-storage-0\") pod \"aaf65cec-c416-49a5-87db-11d9713a96ed\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.646708 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-nb\") pod \"aaf65cec-c416-49a5-87db-11d9713a96ed\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.646751 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-sb\") pod \"aaf65cec-c416-49a5-87db-11d9713a96ed\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " Feb 17 09:21:02 
crc kubenswrapper[4848]: I0217 09:21:02.646826 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-svc\") pod \"aaf65cec-c416-49a5-87db-11d9713a96ed\" (UID: \"aaf65cec-c416-49a5-87db-11d9713a96ed\") " Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.650939 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aaf65cec-c416-49a5-87db-11d9713a96ed-kube-api-access-f6xqm" (OuterVolumeSpecName: "kube-api-access-f6xqm") pod "aaf65cec-c416-49a5-87db-11d9713a96ed" (UID: "aaf65cec-c416-49a5-87db-11d9713a96ed"). InnerVolumeSpecName "kube-api-access-f6xqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.688666 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aaf65cec-c416-49a5-87db-11d9713a96ed" (UID: "aaf65cec-c416-49a5-87db-11d9713a96ed"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.690874 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-config" (OuterVolumeSpecName: "config") pod "aaf65cec-c416-49a5-87db-11d9713a96ed" (UID: "aaf65cec-c416-49a5-87db-11d9713a96ed"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.693213 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aaf65cec-c416-49a5-87db-11d9713a96ed" (UID: "aaf65cec-c416-49a5-87db-11d9713a96ed"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.693734 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aaf65cec-c416-49a5-87db-11d9713a96ed" (UID: "aaf65cec-c416-49a5-87db-11d9713a96ed"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.696021 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aaf65cec-c416-49a5-87db-11d9713a96ed" (UID: "aaf65cec-c416-49a5-87db-11d9713a96ed"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.748667 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.748955 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6xqm\" (UniqueName: \"kubernetes.io/projected/aaf65cec-c416-49a5-87db-11d9713a96ed-kube-api-access-f6xqm\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.748966 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.748974 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.748991 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:02 crc kubenswrapper[4848]: I0217 09:21:02.748999 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aaf65cec-c416-49a5-87db-11d9713a96ed-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.115933 4848 generic.go:334] "Generic (PLEG): container finished" podID="aaf65cec-c416-49a5-87db-11d9713a96ed" containerID="85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d" exitCode=0 Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.115984 4848 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778944759-7769n" Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.116003 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778944759-7769n" event={"ID":"aaf65cec-c416-49a5-87db-11d9713a96ed","Type":"ContainerDied","Data":"85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d"} Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.116045 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778944759-7769n" event={"ID":"aaf65cec-c416-49a5-87db-11d9713a96ed","Type":"ContainerDied","Data":"8767a74883d5247e94758f952c587007a2e404dbc53034e8090b54ccfba98376"} Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.116077 4848 scope.go:117] "RemoveContainer" containerID="85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d" Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.119383 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" event={"ID":"bcafc0d6-9155-456a-b63d-b5c1944fb51c","Type":"ContainerStarted","Data":"9a16f27c63c8fd245ee9c7bf24dd8991f885618bde2a6b3c6564277b4466e7e1"} Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.119903 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.134553 4848 scope.go:117] "RemoveContainer" containerID="09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4" Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.151855 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" podStartSLOduration=3.151828094 podStartE2EDuration="3.151828094s" podCreationTimestamp="2026-02-17 09:21:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 09:21:03.14721768 +0000 UTC m=+940.690473336" watchObservedRunningTime="2026-02-17 09:21:03.151828094 +0000 UTC m=+940.695083760" Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.168961 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778944759-7769n"] Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.170954 4848 scope.go:117] "RemoveContainer" containerID="85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d" Feb 17 09:21:03 crc kubenswrapper[4848]: E0217 09:21:03.171371 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d\": container with ID starting with 85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d not found: ID does not exist" containerID="85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d" Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.171418 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d"} err="failed to get container status \"85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d\": rpc error: code = NotFound desc = could not find container \"85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d\": container with ID starting with 85f89e1d610cfea829d07a20290dc1d9b0dc413d8d614e8c04a2ca0f16a3135d not found: ID does not exist" Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.171450 4848 scope.go:117] "RemoveContainer" containerID="09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4" Feb 17 09:21:03 crc kubenswrapper[4848]: E0217 09:21:03.171726 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4\": 
container with ID starting with 09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4 not found: ID does not exist" containerID="09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4" Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.171754 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4"} err="failed to get container status \"09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4\": rpc error: code = NotFound desc = could not find container \"09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4\": container with ID starting with 09e273efe8636d666c360a92b5db7aca79690867393055a4546982fbd8c16bb4 not found: ID does not exist" Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.180928 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-778944759-7769n"] Feb 17 09:21:03 crc kubenswrapper[4848]: I0217 09:21:03.394568 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aaf65cec-c416-49a5-87db-11d9713a96ed" path="/var/lib/kubelet/pods/aaf65cec-c416-49a5-87db-11d9713a96ed/volumes" Feb 17 09:21:04 crc kubenswrapper[4848]: I0217 09:21:04.620259 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:21:04 crc kubenswrapper[4848]: I0217 09:21:04.621981 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:21:04 crc kubenswrapper[4848]: I0217 09:21:04.990635 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r98sw"] Feb 17 09:21:04 crc kubenswrapper[4848]: E0217 09:21:04.991192 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf65cec-c416-49a5-87db-11d9713a96ed" containerName="dnsmasq-dns" Feb 17 09:21:04 crc kubenswrapper[4848]: 
I0217 09:21:04.991225 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf65cec-c416-49a5-87db-11d9713a96ed" containerName="dnsmasq-dns" Feb 17 09:21:04 crc kubenswrapper[4848]: E0217 09:21:04.991246 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aaf65cec-c416-49a5-87db-11d9713a96ed" containerName="init" Feb 17 09:21:04 crc kubenswrapper[4848]: I0217 09:21:04.991253 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="aaf65cec-c416-49a5-87db-11d9713a96ed" containerName="init" Feb 17 09:21:04 crc kubenswrapper[4848]: I0217 09:21:04.991430 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="aaf65cec-c416-49a5-87db-11d9713a96ed" containerName="dnsmasq-dns" Feb 17 09:21:04 crc kubenswrapper[4848]: I0217 09:21:04.992567 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.011233 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r98sw"] Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.081857 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-catalog-content\") pod \"certified-operators-r98sw\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.082124 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdkpv\" (UniqueName: \"kubernetes.io/projected/a097e9f6-c779-42e2-8bbf-767934d60341-kube-api-access-xdkpv\") pod \"certified-operators-r98sw\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.082258 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-utilities\") pod \"certified-operators-r98sw\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.184839 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdkpv\" (UniqueName: \"kubernetes.io/projected/a097e9f6-c779-42e2-8bbf-767934d60341-kube-api-access-xdkpv\") pod \"certified-operators-r98sw\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.184926 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-utilities\") pod \"certified-operators-r98sw\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.185051 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-catalog-content\") pod \"certified-operators-r98sw\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.185569 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-catalog-content\") pod \"certified-operators-r98sw\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.185680 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-utilities\") pod \"certified-operators-r98sw\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.209333 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdkpv\" (UniqueName: \"kubernetes.io/projected/a097e9f6-c779-42e2-8bbf-767934d60341-kube-api-access-xdkpv\") pod \"certified-operators-r98sw\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.318065 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.671576 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8rrkh" podUID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerName="registry-server" probeResult="failure" output=< Feb 17 09:21:05 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s Feb 17 09:21:05 crc kubenswrapper[4848]: > Feb 17 09:21:05 crc kubenswrapper[4848]: I0217 09:21:05.833468 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r98sw"] Feb 17 09:21:06 crc kubenswrapper[4848]: I0217 09:21:06.144175 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r98sw" event={"ID":"a097e9f6-c779-42e2-8bbf-767934d60341","Type":"ContainerStarted","Data":"e4a508a5968beef0972cea08167f706605052013157cad62b0096c07c6c6224f"} Feb 17 09:21:06 crc kubenswrapper[4848]: I0217 09:21:06.144213 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r98sw" 
event={"ID":"a097e9f6-c779-42e2-8bbf-767934d60341","Type":"ContainerStarted","Data":"9bc0322ba50760f17f36a258074edfecb1945790f9c8680f6b5f6fca59a951e1"} Feb 17 09:21:07 crc kubenswrapper[4848]: I0217 09:21:07.153368 4848 generic.go:334] "Generic (PLEG): container finished" podID="a097e9f6-c779-42e2-8bbf-767934d60341" containerID="e4a508a5968beef0972cea08167f706605052013157cad62b0096c07c6c6224f" exitCode=0 Feb 17 09:21:07 crc kubenswrapper[4848]: I0217 09:21:07.153471 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r98sw" event={"ID":"a097e9f6-c779-42e2-8bbf-767934d60341","Type":"ContainerDied","Data":"e4a508a5968beef0972cea08167f706605052013157cad62b0096c07c6c6224f"} Feb 17 09:21:08 crc kubenswrapper[4848]: I0217 09:21:08.164482 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r98sw" event={"ID":"a097e9f6-c779-42e2-8bbf-767934d60341","Type":"ContainerStarted","Data":"30c524ee874fd62fee31ffee4aeca3c764124a36cbdedc19eab19f7b76a4a219"} Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.177845 4848 generic.go:334] "Generic (PLEG): container finished" podID="a097e9f6-c779-42e2-8bbf-767934d60341" containerID="30c524ee874fd62fee31ffee4aeca3c764124a36cbdedc19eab19f7b76a4a219" exitCode=0 Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.177915 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r98sw" event={"ID":"a097e9f6-c779-42e2-8bbf-767934d60341","Type":"ContainerDied","Data":"30c524ee874fd62fee31ffee4aeca3c764124a36cbdedc19eab19f7b76a4a219"} Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.228045 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.638666 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-ffrb6"] Feb 17 09:21:09 crc kubenswrapper[4848]: 
I0217 09:21:09.639876 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ffrb6" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.657475 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ffrb6"] Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.687012 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44dw7\" (UniqueName: \"kubernetes.io/projected/4a4b82e2-58b0-4b38-b7db-6882298598c4-kube-api-access-44dw7\") pod \"cinder-db-create-ffrb6\" (UID: \"4a4b82e2-58b0-4b38-b7db-6882298598c4\") " pod="openstack/cinder-db-create-ffrb6" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.687101 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a4b82e2-58b0-4b38-b7db-6882298598c4-operator-scripts\") pod \"cinder-db-create-ffrb6\" (UID: \"4a4b82e2-58b0-4b38-b7db-6882298598c4\") " pod="openstack/cinder-db-create-ffrb6" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.740900 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-030c-account-create-update-zhd9p"] Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.743394 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-030c-account-create-update-zhd9p" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.746964 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.757062 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-030c-account-create-update-zhd9p"] Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.788632 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a4b82e2-58b0-4b38-b7db-6882298598c4-operator-scripts\") pod \"cinder-db-create-ffrb6\" (UID: \"4a4b82e2-58b0-4b38-b7db-6882298598c4\") " pod="openstack/cinder-db-create-ffrb6" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.788850 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44dw7\" (UniqueName: \"kubernetes.io/projected/4a4b82e2-58b0-4b38-b7db-6882298598c4-kube-api-access-44dw7\") pod \"cinder-db-create-ffrb6\" (UID: \"4a4b82e2-58b0-4b38-b7db-6882298598c4\") " pod="openstack/cinder-db-create-ffrb6" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.789861 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a4b82e2-58b0-4b38-b7db-6882298598c4-operator-scripts\") pod \"cinder-db-create-ffrb6\" (UID: \"4a4b82e2-58b0-4b38-b7db-6882298598c4\") " pod="openstack/cinder-db-create-ffrb6" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.807977 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44dw7\" (UniqueName: \"kubernetes.io/projected/4a4b82e2-58b0-4b38-b7db-6882298598c4-kube-api-access-44dw7\") pod \"cinder-db-create-ffrb6\" (UID: \"4a4b82e2-58b0-4b38-b7db-6882298598c4\") " pod="openstack/cinder-db-create-ffrb6" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 
09:21:09.847450 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-605d-account-create-update-lscpt"] Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.848532 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-605d-account-create-update-lscpt" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.849852 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.871258 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-s9chz"] Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.872226 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-s9chz" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.895975 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-s9chz"] Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.897956 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2sg7\" (UniqueName: \"kubernetes.io/projected/272b2170-d012-47c9-9d08-1a696ef88165-kube-api-access-b2sg7\") pod \"cinder-030c-account-create-update-zhd9p\" (UID: \"272b2170-d012-47c9-9d08-1a696ef88165\") " pod="openstack/cinder-030c-account-create-update-zhd9p" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.898043 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272b2170-d012-47c9-9d08-1a696ef88165-operator-scripts\") pod \"cinder-030c-account-create-update-zhd9p\" (UID: \"272b2170-d012-47c9-9d08-1a696ef88165\") " pod="openstack/cinder-030c-account-create-update-zhd9p" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.929192 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-605d-account-create-update-lscpt"] Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.960358 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ffrb6" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.999022 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6hm7\" (UniqueName: \"kubernetes.io/projected/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-kube-api-access-b6hm7\") pod \"barbican-605d-account-create-update-lscpt\" (UID: \"c6bc83da-3773-4c42-95b7-56c77fd0fdb1\") " pod="openstack/barbican-605d-account-create-update-lscpt" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.999081 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272b2170-d012-47c9-9d08-1a696ef88165-operator-scripts\") pod \"cinder-030c-account-create-update-zhd9p\" (UID: \"272b2170-d012-47c9-9d08-1a696ef88165\") " pod="openstack/cinder-030c-account-create-update-zhd9p" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.999142 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-operator-scripts\") pod \"barbican-605d-account-create-update-lscpt\" (UID: \"c6bc83da-3773-4c42-95b7-56c77fd0fdb1\") " pod="openstack/barbican-605d-account-create-update-lscpt" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.999183 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46v58\" (UniqueName: \"kubernetes.io/projected/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-kube-api-access-46v58\") pod \"barbican-db-create-s9chz\" (UID: \"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00\") " pod="openstack/barbican-db-create-s9chz" Feb 17 09:21:09 crc kubenswrapper[4848]: 
I0217 09:21:09.999209 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2sg7\" (UniqueName: \"kubernetes.io/projected/272b2170-d012-47c9-9d08-1a696ef88165-kube-api-access-b2sg7\") pod \"cinder-030c-account-create-update-zhd9p\" (UID: \"272b2170-d012-47c9-9d08-1a696ef88165\") " pod="openstack/cinder-030c-account-create-update-zhd9p" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.999225 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-operator-scripts\") pod \"barbican-db-create-s9chz\" (UID: \"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00\") " pod="openstack/barbican-db-create-s9chz" Feb 17 09:21:09 crc kubenswrapper[4848]: I0217 09:21:09.999795 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272b2170-d012-47c9-9d08-1a696ef88165-operator-scripts\") pod \"cinder-030c-account-create-update-zhd9p\" (UID: \"272b2170-d012-47c9-9d08-1a696ef88165\") " pod="openstack/cinder-030c-account-create-update-zhd9p" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.020665 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jjhrd"] Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.021605 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.030220 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.030408 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.030507 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.038278 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2sg7\" (UniqueName: \"kubernetes.io/projected/272b2170-d012-47c9-9d08-1a696ef88165-kube-api-access-b2sg7\") pod \"cinder-030c-account-create-update-zhd9p\" (UID: \"272b2170-d012-47c9-9d08-1a696ef88165\") " pod="openstack/cinder-030c-account-create-update-zhd9p" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.041752 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p6mz5" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.050822 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jjhrd"] Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.057332 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-030c-account-create-update-zhd9p" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.060975 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4s7cd"] Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.062122 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4s7cd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.077910 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-c50b-account-create-update-lvn7w"] Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.079119 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c50b-account-create-update-lvn7w" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.083015 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.100640 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-operator-scripts\") pod \"barbican-605d-account-create-update-lscpt\" (UID: \"c6bc83da-3773-4c42-95b7-56c77fd0fdb1\") " pod="openstack/barbican-605d-account-create-update-lscpt" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.100700 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46v58\" (UniqueName: \"kubernetes.io/projected/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-kube-api-access-46v58\") pod \"barbican-db-create-s9chz\" (UID: \"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00\") " pod="openstack/barbican-db-create-s9chz" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.100735 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-operator-scripts\") pod \"barbican-db-create-s9chz\" (UID: \"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00\") " pod="openstack/barbican-db-create-s9chz" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.100783 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6hm7\" (UniqueName: 
\"kubernetes.io/projected/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-kube-api-access-b6hm7\") pod \"barbican-605d-account-create-update-lscpt\" (UID: \"c6bc83da-3773-4c42-95b7-56c77fd0fdb1\") " pod="openstack/barbican-605d-account-create-update-lscpt" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.101628 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-operator-scripts\") pod \"barbican-605d-account-create-update-lscpt\" (UID: \"c6bc83da-3773-4c42-95b7-56c77fd0fdb1\") " pod="openstack/barbican-605d-account-create-update-lscpt" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.102231 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-operator-scripts\") pod \"barbican-db-create-s9chz\" (UID: \"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00\") " pod="openstack/barbican-db-create-s9chz" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.105833 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4s7cd"] Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.153400 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6hm7\" (UniqueName: \"kubernetes.io/projected/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-kube-api-access-b6hm7\") pod \"barbican-605d-account-create-update-lscpt\" (UID: \"c6bc83da-3773-4c42-95b7-56c77fd0fdb1\") " pod="openstack/barbican-605d-account-create-update-lscpt" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.156917 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c50b-account-create-update-lvn7w"] Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.161134 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46v58\" (UniqueName: 
\"kubernetes.io/projected/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-kube-api-access-46v58\") pod \"barbican-db-create-s9chz\" (UID: \"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00\") " pod="openstack/barbican-db-create-s9chz" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.202343 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-operator-scripts\") pod \"neutron-c50b-account-create-update-lvn7w\" (UID: \"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3\") " pod="openstack/neutron-c50b-account-create-update-lvn7w" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.202747 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvrwb\" (UniqueName: \"kubernetes.io/projected/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-kube-api-access-rvrwb\") pod \"neutron-c50b-account-create-update-lvn7w\" (UID: \"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3\") " pod="openstack/neutron-c50b-account-create-update-lvn7w" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.202830 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-combined-ca-bundle\") pod \"keystone-db-sync-jjhrd\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.202862 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27nx5\" (UniqueName: \"kubernetes.io/projected/eab60454-0853-4ec5-ba88-78e220fab168-kube-api-access-27nx5\") pod \"keystone-db-sync-jjhrd\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.202886 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04dc10b3-8edf-4385-bdfa-24b322f8355e-operator-scripts\") pod \"neutron-db-create-4s7cd\" (UID: \"04dc10b3-8edf-4385-bdfa-24b322f8355e\") " pod="openstack/neutron-db-create-4s7cd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.202928 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfdpx\" (UniqueName: \"kubernetes.io/projected/04dc10b3-8edf-4385-bdfa-24b322f8355e-kube-api-access-sfdpx\") pod \"neutron-db-create-4s7cd\" (UID: \"04dc10b3-8edf-4385-bdfa-24b322f8355e\") " pod="openstack/neutron-db-create-4s7cd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.203014 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-config-data\") pod \"keystone-db-sync-jjhrd\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.206181 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-605d-account-create-update-lscpt" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.225204 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r98sw" event={"ID":"a097e9f6-c779-42e2-8bbf-767934d60341","Type":"ContainerStarted","Data":"0045bb83a18a7b56c8feaf6618b65f2f1b3fe11cb129f86bdc6604c4a7e5a9dd"} Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.233116 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-s9chz" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.264232 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r98sw" podStartSLOduration=3.802465946 podStartE2EDuration="6.264212985s" podCreationTimestamp="2026-02-17 09:21:04 +0000 UTC" firstStartedPulling="2026-02-17 09:21:07.15532146 +0000 UTC m=+944.698577116" lastFinishedPulling="2026-02-17 09:21:09.617068509 +0000 UTC m=+947.160324155" observedRunningTime="2026-02-17 09:21:10.262419303 +0000 UTC m=+947.805674949" watchObservedRunningTime="2026-02-17 09:21:10.264212985 +0000 UTC m=+947.807468631" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.305904 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27nx5\" (UniqueName: \"kubernetes.io/projected/eab60454-0853-4ec5-ba88-78e220fab168-kube-api-access-27nx5\") pod \"keystone-db-sync-jjhrd\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.305978 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04dc10b3-8edf-4385-bdfa-24b322f8355e-operator-scripts\") pod \"neutron-db-create-4s7cd\" (UID: \"04dc10b3-8edf-4385-bdfa-24b322f8355e\") " pod="openstack/neutron-db-create-4s7cd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.306054 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfdpx\" (UniqueName: \"kubernetes.io/projected/04dc10b3-8edf-4385-bdfa-24b322f8355e-kube-api-access-sfdpx\") pod \"neutron-db-create-4s7cd\" (UID: \"04dc10b3-8edf-4385-bdfa-24b322f8355e\") " pod="openstack/neutron-db-create-4s7cd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.306232 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-config-data\") pod \"keystone-db-sync-jjhrd\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.306322 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-operator-scripts\") pod \"neutron-c50b-account-create-update-lvn7w\" (UID: \"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3\") " pod="openstack/neutron-c50b-account-create-update-lvn7w" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.306356 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvrwb\" (UniqueName: \"kubernetes.io/projected/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-kube-api-access-rvrwb\") pod \"neutron-c50b-account-create-update-lvn7w\" (UID: \"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3\") " pod="openstack/neutron-c50b-account-create-update-lvn7w" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.306423 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-combined-ca-bundle\") pod \"keystone-db-sync-jjhrd\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.307703 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04dc10b3-8edf-4385-bdfa-24b322f8355e-operator-scripts\") pod \"neutron-db-create-4s7cd\" (UID: \"04dc10b3-8edf-4385-bdfa-24b322f8355e\") " pod="openstack/neutron-db-create-4s7cd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.308043 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-operator-scripts\") pod \"neutron-c50b-account-create-update-lvn7w\" (UID: \"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3\") " pod="openstack/neutron-c50b-account-create-update-lvn7w" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.317391 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-config-data\") pod \"keystone-db-sync-jjhrd\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.319176 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-combined-ca-bundle\") pod \"keystone-db-sync-jjhrd\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.323633 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfdpx\" (UniqueName: \"kubernetes.io/projected/04dc10b3-8edf-4385-bdfa-24b322f8355e-kube-api-access-sfdpx\") pod \"neutron-db-create-4s7cd\" (UID: \"04dc10b3-8edf-4385-bdfa-24b322f8355e\") " pod="openstack/neutron-db-create-4s7cd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.325069 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvrwb\" (UniqueName: \"kubernetes.io/projected/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-kube-api-access-rvrwb\") pod \"neutron-c50b-account-create-update-lvn7w\" (UID: \"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3\") " pod="openstack/neutron-c50b-account-create-update-lvn7w" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.329002 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27nx5\" (UniqueName: 
\"kubernetes.io/projected/eab60454-0853-4ec5-ba88-78e220fab168-kube-api-access-27nx5\") pod \"keystone-db-sync-jjhrd\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.471359 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.496416 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4s7cd" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.510219 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c50b-account-create-update-lvn7w" Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.560534 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-ffrb6"] Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.769005 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-030c-account-create-update-zhd9p"] Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.834947 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-s9chz"] Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.867370 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-605d-account-create-update-lscpt"] Feb 17 09:21:10 crc kubenswrapper[4848]: I0217 09:21:10.950595 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.016077 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-nxk7q"] Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.016508 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" 
podUID="cfbaba75-c3cd-4281-903a-1e77c7409afc" containerName="dnsmasq-dns" containerID="cri-o://13e36b3a73c7897c0c9292939e1849490e5d428526341cb6d5d4106ef651477b" gracePeriod=10 Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.104361 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jjhrd"] Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.116263 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4s7cd"] Feb 17 09:21:11 crc kubenswrapper[4848]: W0217 09:21:11.129921 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeab60454_0853_4ec5_ba88_78e220fab168.slice/crio-7ac0dbbc88a2e3f659487838ebb699bb1e8695bf7cd7ec0381cd11d70d0768b0 WatchSource:0}: Error finding container 7ac0dbbc88a2e3f659487838ebb699bb1e8695bf7cd7ec0381cd11d70d0768b0: Status 404 returned error can't find the container with id 7ac0dbbc88a2e3f659487838ebb699bb1e8695bf7cd7ec0381cd11d70d0768b0 Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.246405 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s9chz" event={"ID":"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00","Type":"ContainerStarted","Data":"9f5736ebd81d8cf0a1c76a7ab5c957539f3ea45df0e0d5d2fafd0c53c488ece3"} Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.246446 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s9chz" event={"ID":"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00","Type":"ContainerStarted","Data":"22f407eb74a183211531ddb18832535857cb7756ed68e30102cc5351f9cd1dad"} Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.249509 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ffrb6" event={"ID":"4a4b82e2-58b0-4b38-b7db-6882298598c4","Type":"ContainerStarted","Data":"7c761436a615b2cc970e711cc84889ba2c0ac2004d65c43913dee28aac4499d8"} Feb 17 09:21:11 crc 
kubenswrapper[4848]: I0217 09:21:11.249551 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ffrb6" event={"ID":"4a4b82e2-58b0-4b38-b7db-6882298598c4","Type":"ContainerStarted","Data":"8109b9eff41042ccf5afb8834bfc2c7629a51b20175fbd2bb9e8418134392092"} Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.251340 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-030c-account-create-update-zhd9p" event={"ID":"272b2170-d012-47c9-9d08-1a696ef88165","Type":"ContainerStarted","Data":"d9a98dc4add23ca4f47299741110c817f5de22f5be6b3864588e30ccaace9359"} Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.251374 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-030c-account-create-update-zhd9p" event={"ID":"272b2170-d012-47c9-9d08-1a696ef88165","Type":"ContainerStarted","Data":"5484d75a8aaebf72d0e538527508fb4a72cb3275e3d1796ddec9b63a917be412"} Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.252633 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4s7cd" event={"ID":"04dc10b3-8edf-4385-bdfa-24b322f8355e","Type":"ContainerStarted","Data":"7b17f523cd7d1b9a3da013e1438463a64c315b014f4091d72040f002e737fcd9"} Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.253963 4848 generic.go:334] "Generic (PLEG): container finished" podID="cfbaba75-c3cd-4281-903a-1e77c7409afc" containerID="13e36b3a73c7897c0c9292939e1849490e5d428526341cb6d5d4106ef651477b" exitCode=0 Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.254018 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" event={"ID":"cfbaba75-c3cd-4281-903a-1e77c7409afc","Type":"ContainerDied","Data":"13e36b3a73c7897c0c9292939e1849490e5d428526341cb6d5d4106ef651477b"} Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.255324 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-605d-account-create-update-lscpt" 
event={"ID":"c6bc83da-3773-4c42-95b7-56c77fd0fdb1","Type":"ContainerStarted","Data":"7133481f5a6206c7e497ea5e74f8de8c2602fbe71a5348186b90335e51516f40"} Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.255350 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-605d-account-create-update-lscpt" event={"ID":"c6bc83da-3773-4c42-95b7-56c77fd0fdb1","Type":"ContainerStarted","Data":"b2de8e1e59346d46f3f8ccc32a4b24e219aaa568b33e0746ee92a9cffc654971"} Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.257417 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jjhrd" event={"ID":"eab60454-0853-4ec5-ba88-78e220fab168","Type":"ContainerStarted","Data":"7ac0dbbc88a2e3f659487838ebb699bb1e8695bf7cd7ec0381cd11d70d0768b0"} Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.272196 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-c50b-account-create-update-lvn7w"] Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.275948 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-s9chz" podStartSLOduration=2.275928958 podStartE2EDuration="2.275928958s" podCreationTimestamp="2026-02-17 09:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:11.264366542 +0000 UTC m=+948.807622188" watchObservedRunningTime="2026-02-17 09:21:11.275928958 +0000 UTC m=+948.819184604" Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.300132 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-ffrb6" podStartSLOduration=2.30010833 podStartE2EDuration="2.30010833s" podCreationTimestamp="2026-02-17 09:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:11.283125057 +0000 UTC 
m=+948.826380703" watchObservedRunningTime="2026-02-17 09:21:11.30010833 +0000 UTC m=+948.843363976" Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.323040 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-605d-account-create-update-lscpt" podStartSLOduration=2.323021906 podStartE2EDuration="2.323021906s" podCreationTimestamp="2026-02-17 09:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:11.303714585 +0000 UTC m=+948.846970231" watchObservedRunningTime="2026-02-17 09:21:11.323021906 +0000 UTC m=+948.866277552" Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.325406 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-030c-account-create-update-zhd9p" podStartSLOduration=2.325396235 podStartE2EDuration="2.325396235s" podCreationTimestamp="2026-02-17 09:21:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:11.317981589 +0000 UTC m=+948.861237235" watchObservedRunningTime="2026-02-17 09:21:11.325396235 +0000 UTC m=+948.868651881" Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.747275 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.931919 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-nb\") pod \"cfbaba75-c3cd-4281-903a-1e77c7409afc\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.932005 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-config\") pod \"cfbaba75-c3cd-4281-903a-1e77c7409afc\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.932069 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-sb\") pod \"cfbaba75-c3cd-4281-903a-1e77c7409afc\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.933114 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-dns-svc\") pod \"cfbaba75-c3cd-4281-903a-1e77c7409afc\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.933272 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-98zkj\" (UniqueName: \"kubernetes.io/projected/cfbaba75-c3cd-4281-903a-1e77c7409afc-kube-api-access-98zkj\") pod \"cfbaba75-c3cd-4281-903a-1e77c7409afc\" (UID: \"cfbaba75-c3cd-4281-903a-1e77c7409afc\") " Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.944981 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cfbaba75-c3cd-4281-903a-1e77c7409afc-kube-api-access-98zkj" (OuterVolumeSpecName: "kube-api-access-98zkj") pod "cfbaba75-c3cd-4281-903a-1e77c7409afc" (UID: "cfbaba75-c3cd-4281-903a-1e77c7409afc"). InnerVolumeSpecName "kube-api-access-98zkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.973410 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cfbaba75-c3cd-4281-903a-1e77c7409afc" (UID: "cfbaba75-c3cd-4281-903a-1e77c7409afc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.986015 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cfbaba75-c3cd-4281-903a-1e77c7409afc" (UID: "cfbaba75-c3cd-4281-903a-1e77c7409afc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:11 crc kubenswrapper[4848]: I0217 09:21:11.999444 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-config" (OuterVolumeSpecName: "config") pod "cfbaba75-c3cd-4281-903a-1e77c7409afc" (UID: "cfbaba75-c3cd-4281-903a-1e77c7409afc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.002424 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cfbaba75-c3cd-4281-903a-1e77c7409afc" (UID: "cfbaba75-c3cd-4281-903a-1e77c7409afc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.035663 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.035694 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.035707 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.035721 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cfbaba75-c3cd-4281-903a-1e77c7409afc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.035732 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-98zkj\" (UniqueName: \"kubernetes.io/projected/cfbaba75-c3cd-4281-903a-1e77c7409afc-kube-api-access-98zkj\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.273746 4848 generic.go:334] "Generic (PLEG): container finished" podID="04dc10b3-8edf-4385-bdfa-24b322f8355e" containerID="befb432b3470e7a1531ee9e2e1091f5d81c97049f871b923862d0c1fe43807b3" exitCode=0 Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.273894 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4s7cd" event={"ID":"04dc10b3-8edf-4385-bdfa-24b322f8355e","Type":"ContainerDied","Data":"befb432b3470e7a1531ee9e2e1091f5d81c97049f871b923862d0c1fe43807b3"} Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 
09:21:12.277097 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" event={"ID":"cfbaba75-c3cd-4281-903a-1e77c7409afc","Type":"ContainerDied","Data":"bce38fa0d9306a714c4cadd92b11b9c9f14ff2dde397ee386e66a22ea610033d"} Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.277377 4848 scope.go:117] "RemoveContainer" containerID="13e36b3a73c7897c0c9292939e1849490e5d428526341cb6d5d4106ef651477b" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.277783 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-66b577f8c-nxk7q" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.299989 4848 generic.go:334] "Generic (PLEG): container finished" podID="c6bc83da-3773-4c42-95b7-56c77fd0fdb1" containerID="7133481f5a6206c7e497ea5e74f8de8c2602fbe71a5348186b90335e51516f40" exitCode=0 Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.300331 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-605d-account-create-update-lscpt" event={"ID":"c6bc83da-3773-4c42-95b7-56c77fd0fdb1","Type":"ContainerDied","Data":"7133481f5a6206c7e497ea5e74f8de8c2602fbe71a5348186b90335e51516f40"} Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.304130 4848 generic.go:334] "Generic (PLEG): container finished" podID="82b8b5f5-7a6c-4a94-9255-510bd8ac99a3" containerID="c96bc5281c3998f7bbcdded7e70ad475f607c37110ccef7975e872b7b223f6b9" exitCode=0 Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.304242 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c50b-account-create-update-lvn7w" event={"ID":"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3","Type":"ContainerDied","Data":"c96bc5281c3998f7bbcdded7e70ad475f607c37110ccef7975e872b7b223f6b9"} Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.304272 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c50b-account-create-update-lvn7w" 
event={"ID":"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3","Type":"ContainerStarted","Data":"d633376e21203455ab356eaaf45581b1d535146e10aaf823ff7a6fcfe3a08e45"} Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.307586 4848 generic.go:334] "Generic (PLEG): container finished" podID="e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00" containerID="9f5736ebd81d8cf0a1c76a7ab5c957539f3ea45df0e0d5d2fafd0c53c488ece3" exitCode=0 Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.307680 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s9chz" event={"ID":"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00","Type":"ContainerDied","Data":"9f5736ebd81d8cf0a1c76a7ab5c957539f3ea45df0e0d5d2fafd0c53c488ece3"} Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.313458 4848 generic.go:334] "Generic (PLEG): container finished" podID="4a4b82e2-58b0-4b38-b7db-6882298598c4" containerID="7c761436a615b2cc970e711cc84889ba2c0ac2004d65c43913dee28aac4499d8" exitCode=0 Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.313534 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ffrb6" event={"ID":"4a4b82e2-58b0-4b38-b7db-6882298598c4","Type":"ContainerDied","Data":"7c761436a615b2cc970e711cc84889ba2c0ac2004d65c43913dee28aac4499d8"} Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.318427 4848 generic.go:334] "Generic (PLEG): container finished" podID="272b2170-d012-47c9-9d08-1a696ef88165" containerID="d9a98dc4add23ca4f47299741110c817f5de22f5be6b3864588e30ccaace9359" exitCode=0 Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.318481 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-030c-account-create-update-zhd9p" event={"ID":"272b2170-d012-47c9-9d08-1a696ef88165","Type":"ContainerDied","Data":"d9a98dc4add23ca4f47299741110c817f5de22f5be6b3864588e30ccaace9359"} Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.329124 4848 scope.go:117] "RemoveContainer" 
containerID="0e6f8ff0d254a1c12057a1fc0d095620b587f7452d72c795827ad107faabf2a5" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.398664 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-nxk7q"] Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.416293 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-66b577f8c-nxk7q"] Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.956335 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-nvhmn"] Feb 17 09:21:12 crc kubenswrapper[4848]: E0217 09:21:12.956735 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfbaba75-c3cd-4281-903a-1e77c7409afc" containerName="init" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.956756 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfbaba75-c3cd-4281-903a-1e77c7409afc" containerName="init" Feb 17 09:21:12 crc kubenswrapper[4848]: E0217 09:21:12.956786 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfbaba75-c3cd-4281-903a-1e77c7409afc" containerName="dnsmasq-dns" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.956795 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfbaba75-c3cd-4281-903a-1e77c7409afc" containerName="dnsmasq-dns" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.957096 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfbaba75-c3cd-4281-903a-1e77c7409afc" containerName="dnsmasq-dns" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.960477 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:12 crc kubenswrapper[4848]: I0217 09:21:12.971414 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvhmn"] Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.053307 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6j4s\" (UniqueName: \"kubernetes.io/projected/0a1f257f-6ee5-49f4-8c89-033b43dc561c-kube-api-access-m6j4s\") pod \"redhat-marketplace-nvhmn\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.053446 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-catalog-content\") pod \"redhat-marketplace-nvhmn\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.053472 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-utilities\") pod \"redhat-marketplace-nvhmn\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.155175 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6j4s\" (UniqueName: \"kubernetes.io/projected/0a1f257f-6ee5-49f4-8c89-033b43dc561c-kube-api-access-m6j4s\") pod \"redhat-marketplace-nvhmn\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.155286 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-catalog-content\") pod \"redhat-marketplace-nvhmn\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.155314 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-utilities\") pod \"redhat-marketplace-nvhmn\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.155995 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-utilities\") pod \"redhat-marketplace-nvhmn\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.156119 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-catalog-content\") pod \"redhat-marketplace-nvhmn\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.192324 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6j4s\" (UniqueName: \"kubernetes.io/projected/0a1f257f-6ee5-49f4-8c89-033b43dc561c-kube-api-access-m6j4s\") pod \"redhat-marketplace-nvhmn\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.283215 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:13 crc kubenswrapper[4848]: I0217 09:21:13.402111 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfbaba75-c3cd-4281-903a-1e77c7409afc" path="/var/lib/kubelet/pods/cfbaba75-c3cd-4281-903a-1e77c7409afc/volumes" Feb 17 09:21:14 crc kubenswrapper[4848]: I0217 09:21:14.729713 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:21:14 crc kubenswrapper[4848]: I0217 09:21:14.778132 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:21:15 crc kubenswrapper[4848]: I0217 09:21:15.318733 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:15 crc kubenswrapper[4848]: I0217 09:21:15.318840 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.141316 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-605d-account-create-update-lscpt" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.182934 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4s7cd" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.208497 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-030c-account-create-update-zhd9p" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.219503 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c50b-account-create-update-lvn7w" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.220479 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-ffrb6" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.236680 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-s9chz" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.343444 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44dw7\" (UniqueName: \"kubernetes.io/projected/4a4b82e2-58b0-4b38-b7db-6882298598c4-kube-api-access-44dw7\") pod \"4a4b82e2-58b0-4b38-b7db-6882298598c4\" (UID: \"4a4b82e2-58b0-4b38-b7db-6882298598c4\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.343490 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvrwb\" (UniqueName: \"kubernetes.io/projected/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-kube-api-access-rvrwb\") pod \"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3\" (UID: \"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.343667 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-operator-scripts\") pod \"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00\" (UID: \"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.343707 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04dc10b3-8edf-4385-bdfa-24b322f8355e-operator-scripts\") pod \"04dc10b3-8edf-4385-bdfa-24b322f8355e\" (UID: \"04dc10b3-8edf-4385-bdfa-24b322f8355e\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.343805 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272b2170-d012-47c9-9d08-1a696ef88165-operator-scripts\") pod 
\"272b2170-d012-47c9-9d08-1a696ef88165\" (UID: \"272b2170-d012-47c9-9d08-1a696ef88165\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.343826 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6hm7\" (UniqueName: \"kubernetes.io/projected/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-kube-api-access-b6hm7\") pod \"c6bc83da-3773-4c42-95b7-56c77fd0fdb1\" (UID: \"c6bc83da-3773-4c42-95b7-56c77fd0fdb1\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.343910 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46v58\" (UniqueName: \"kubernetes.io/projected/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-kube-api-access-46v58\") pod \"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00\" (UID: \"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.343934 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a4b82e2-58b0-4b38-b7db-6882298598c4-operator-scripts\") pod \"4a4b82e2-58b0-4b38-b7db-6882298598c4\" (UID: \"4a4b82e2-58b0-4b38-b7db-6882298598c4\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.343962 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-operator-scripts\") pod \"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3\" (UID: \"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.344010 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-operator-scripts\") pod \"c6bc83da-3773-4c42-95b7-56c77fd0fdb1\" (UID: \"c6bc83da-3773-4c42-95b7-56c77fd0fdb1\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.344034 4848 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-b2sg7\" (UniqueName: \"kubernetes.io/projected/272b2170-d012-47c9-9d08-1a696ef88165-kube-api-access-b2sg7\") pod \"272b2170-d012-47c9-9d08-1a696ef88165\" (UID: \"272b2170-d012-47c9-9d08-1a696ef88165\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.344064 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfdpx\" (UniqueName: \"kubernetes.io/projected/04dc10b3-8edf-4385-bdfa-24b322f8355e-kube-api-access-sfdpx\") pod \"04dc10b3-8edf-4385-bdfa-24b322f8355e\" (UID: \"04dc10b3-8edf-4385-bdfa-24b322f8355e\") " Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.345885 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "82b8b5f5-7a6c-4a94-9255-510bd8ac99a3" (UID: "82b8b5f5-7a6c-4a94-9255-510bd8ac99a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.346042 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/272b2170-d012-47c9-9d08-1a696ef88165-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "272b2170-d012-47c9-9d08-1a696ef88165" (UID: "272b2170-d012-47c9-9d08-1a696ef88165"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.345990 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a4b82e2-58b0-4b38-b7db-6882298598c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a4b82e2-58b0-4b38-b7db-6882298598c4" (UID: "4a4b82e2-58b0-4b38-b7db-6882298598c4"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.346126 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6bc83da-3773-4c42-95b7-56c77fd0fdb1" (UID: "c6bc83da-3773-4c42-95b7-56c77fd0fdb1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.346145 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00" (UID: "e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.346292 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04dc10b3-8edf-4385-bdfa-24b322f8355e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "04dc10b3-8edf-4385-bdfa-24b322f8355e" (UID: "04dc10b3-8edf-4385-bdfa-24b322f8355e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.348617 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a4b82e2-58b0-4b38-b7db-6882298598c4-kube-api-access-44dw7" (OuterVolumeSpecName: "kube-api-access-44dw7") pod "4a4b82e2-58b0-4b38-b7db-6882298598c4" (UID: "4a4b82e2-58b0-4b38-b7db-6882298598c4"). InnerVolumeSpecName "kube-api-access-44dw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.349046 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-kube-api-access-46v58" (OuterVolumeSpecName: "kube-api-access-46v58") pod "e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00" (UID: "e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00"). InnerVolumeSpecName "kube-api-access-46v58". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.349122 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04dc10b3-8edf-4385-bdfa-24b322f8355e-kube-api-access-sfdpx" (OuterVolumeSpecName: "kube-api-access-sfdpx") pod "04dc10b3-8edf-4385-bdfa-24b322f8355e" (UID: "04dc10b3-8edf-4385-bdfa-24b322f8355e"). InnerVolumeSpecName "kube-api-access-sfdpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.349994 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/272b2170-d012-47c9-9d08-1a696ef88165-kube-api-access-b2sg7" (OuterVolumeSpecName: "kube-api-access-b2sg7") pod "272b2170-d012-47c9-9d08-1a696ef88165" (UID: "272b2170-d012-47c9-9d08-1a696ef88165"). InnerVolumeSpecName "kube-api-access-b2sg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.351093 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-kube-api-access-b6hm7" (OuterVolumeSpecName: "kube-api-access-b6hm7") pod "c6bc83da-3773-4c42-95b7-56c77fd0fdb1" (UID: "c6bc83da-3773-4c42-95b7-56c77fd0fdb1"). InnerVolumeSpecName "kube-api-access-b6hm7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.353313 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-kube-api-access-rvrwb" (OuterVolumeSpecName: "kube-api-access-rvrwb") pod "82b8b5f5-7a6c-4a94-9255-510bd8ac99a3" (UID: "82b8b5f5-7a6c-4a94-9255-510bd8ac99a3"). InnerVolumeSpecName "kube-api-access-rvrwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.357243 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4s7cd" event={"ID":"04dc10b3-8edf-4385-bdfa-24b322f8355e","Type":"ContainerDied","Data":"7b17f523cd7d1b9a3da013e1438463a64c315b014f4091d72040f002e737fcd9"} Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.357275 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b17f523cd7d1b9a3da013e1438463a64c315b014f4091d72040f002e737fcd9" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.357324 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4s7cd" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.362058 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-605d-account-create-update-lscpt" event={"ID":"c6bc83da-3773-4c42-95b7-56c77fd0fdb1","Type":"ContainerDied","Data":"b2de8e1e59346d46f3f8ccc32a4b24e219aaa568b33e0746ee92a9cffc654971"} Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.362103 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2de8e1e59346d46f3f8ccc32a4b24e219aaa568b33e0746ee92a9cffc654971" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.362181 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-605d-account-create-update-lscpt" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.363543 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-c50b-account-create-update-lvn7w" event={"ID":"82b8b5f5-7a6c-4a94-9255-510bd8ac99a3","Type":"ContainerDied","Data":"d633376e21203455ab356eaaf45581b1d535146e10aaf823ff7a6fcfe3a08e45"} Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.363561 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d633376e21203455ab356eaaf45581b1d535146e10aaf823ff7a6fcfe3a08e45" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.363607 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-c50b-account-create-update-lvn7w" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.374334 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-r98sw" podUID="a097e9f6-c779-42e2-8bbf-767934d60341" containerName="registry-server" probeResult="failure" output=< Feb 17 09:21:16 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s Feb 17 09:21:16 crc kubenswrapper[4848]: > Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.375676 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jjhrd" event={"ID":"eab60454-0853-4ec5-ba88-78e220fab168","Type":"ContainerStarted","Data":"1904915061b9f32e5e6e94a3f18a6812fbce876f6d896d2ccf4f2ad3f2b986f8"} Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.377854 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-s9chz" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.377884 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-s9chz" event={"ID":"e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00","Type":"ContainerDied","Data":"22f407eb74a183211531ddb18832535857cb7756ed68e30102cc5351f9cd1dad"} Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.378906 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="22f407eb74a183211531ddb18832535857cb7756ed68e30102cc5351f9cd1dad" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.379544 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-ffrb6" event={"ID":"4a4b82e2-58b0-4b38-b7db-6882298598c4","Type":"ContainerDied","Data":"8109b9eff41042ccf5afb8834bfc2c7629a51b20175fbd2bb9e8418134392092"} Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.379569 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8109b9eff41042ccf5afb8834bfc2c7629a51b20175fbd2bb9e8418134392092" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.379605 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-ffrb6" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.380939 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-030c-account-create-update-zhd9p" event={"ID":"272b2170-d012-47c9-9d08-1a696ef88165","Type":"ContainerDied","Data":"5484d75a8aaebf72d0e538527508fb4a72cb3275e3d1796ddec9b63a917be412"} Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.380956 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-030c-account-create-update-zhd9p" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.380965 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5484d75a8aaebf72d0e538527508fb4a72cb3275e3d1796ddec9b63a917be412" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.390479 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jjhrd" podStartSLOduration=2.528488194 podStartE2EDuration="7.390458734s" podCreationTimestamp="2026-02-17 09:21:09 +0000 UTC" firstStartedPulling="2026-02-17 09:21:11.157419226 +0000 UTC m=+948.700674872" lastFinishedPulling="2026-02-17 09:21:16.019389766 +0000 UTC m=+953.562645412" observedRunningTime="2026-02-17 09:21:16.3868833 +0000 UTC m=+953.930138966" watchObservedRunningTime="2026-02-17 09:21:16.390458734 +0000 UTC m=+953.933714390" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446126 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfdpx\" (UniqueName: \"kubernetes.io/projected/04dc10b3-8edf-4385-bdfa-24b322f8355e-kube-api-access-sfdpx\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446209 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44dw7\" (UniqueName: \"kubernetes.io/projected/4a4b82e2-58b0-4b38-b7db-6882298598c4-kube-api-access-44dw7\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446258 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvrwb\" (UniqueName: \"kubernetes.io/projected/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-kube-api-access-rvrwb\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446324 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-operator-scripts\") 
on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446353 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/04dc10b3-8edf-4385-bdfa-24b322f8355e-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446367 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6hm7\" (UniqueName: \"kubernetes.io/projected/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-kube-api-access-b6hm7\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446378 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/272b2170-d012-47c9-9d08-1a696ef88165-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446388 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46v58\" (UniqueName: \"kubernetes.io/projected/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00-kube-api-access-46v58\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446396 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a4b82e2-58b0-4b38-b7db-6882298598c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446405 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.446414 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6bc83da-3773-4c42-95b7-56c77fd0fdb1-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: 
I0217 09:21:16.446423 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2sg7\" (UniqueName: \"kubernetes.io/projected/272b2170-d012-47c9-9d08-1a696ef88165-kube-api-access-b2sg7\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.544381 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvhmn"] Feb 17 09:21:16 crc kubenswrapper[4848]: W0217 09:21:16.545327 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a1f257f_6ee5_49f4_8c89_033b43dc561c.slice/crio-719497241afed82086063d0bf0d3226e147b082c75f4d7783e47db4ba4a08db9 WatchSource:0}: Error finding container 719497241afed82086063d0bf0d3226e147b082c75f4d7783e47db4ba4a08db9: Status 404 returned error can't find the container with id 719497241afed82086063d0bf0d3226e147b082c75f4d7783e47db4ba4a08db9 Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.558708 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8rrkh"] Feb 17 09:21:16 crc kubenswrapper[4848]: I0217 09:21:16.558962 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8rrkh" podUID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerName="registry-server" containerID="cri-o://77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80" gracePeriod=2 Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.092260 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.172784 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-utilities\") pod \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.173010 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-catalog-content\") pod \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.173036 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx226\" (UniqueName: \"kubernetes.io/projected/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-kube-api-access-qx226\") pod \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\" (UID: \"d6791ad5-1ad5-45f3-a8a4-f16b14013bba\") " Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.173707 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-utilities" (OuterVolumeSpecName: "utilities") pod "d6791ad5-1ad5-45f3-a8a4-f16b14013bba" (UID: "d6791ad5-1ad5-45f3-a8a4-f16b14013bba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.183199 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-kube-api-access-qx226" (OuterVolumeSpecName: "kube-api-access-qx226") pod "d6791ad5-1ad5-45f3-a8a4-f16b14013bba" (UID: "d6791ad5-1ad5-45f3-a8a4-f16b14013bba"). InnerVolumeSpecName "kube-api-access-qx226". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.274811 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qx226\" (UniqueName: \"kubernetes.io/projected/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-kube-api-access-qx226\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.274840 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.338040 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6791ad5-1ad5-45f3-a8a4-f16b14013bba" (UID: "d6791ad5-1ad5-45f3-a8a4-f16b14013bba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.375893 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6791ad5-1ad5-45f3-a8a4-f16b14013bba-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.390199 4848 generic.go:334] "Generic (PLEG): container finished" podID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerID="eb3c14030316fe503800a30601339d401b67e1276db8018c1910716c9e81b1da" exitCode=0 Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.393339 4848 generic.go:334] "Generic (PLEG): container finished" podID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerID="77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80" exitCode=0 Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.393463 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8rrkh" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.394579 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvhmn" event={"ID":"0a1f257f-6ee5-49f4-8c89-033b43dc561c","Type":"ContainerDied","Data":"eb3c14030316fe503800a30601339d401b67e1276db8018c1910716c9e81b1da"} Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.394788 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvhmn" event={"ID":"0a1f257f-6ee5-49f4-8c89-033b43dc561c","Type":"ContainerStarted","Data":"719497241afed82086063d0bf0d3226e147b082c75f4d7783e47db4ba4a08db9"} Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.394964 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rrkh" event={"ID":"d6791ad5-1ad5-45f3-a8a4-f16b14013bba","Type":"ContainerDied","Data":"77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80"} Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.395088 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8rrkh" event={"ID":"d6791ad5-1ad5-45f3-a8a4-f16b14013bba","Type":"ContainerDied","Data":"349ea0aa5945675d9b472fec86cfb7ac9a5a63efb3759a8c0a3bdfb86697a80b"} Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.395203 4848 scope.go:117] "RemoveContainer" containerID="77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.430747 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8rrkh"] Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.436541 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8rrkh"] Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.450510 4848 scope.go:117] "RemoveContainer" 
containerID="243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.478504 4848 scope.go:117] "RemoveContainer" containerID="6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.512936 4848 scope.go:117] "RemoveContainer" containerID="77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80" Feb 17 09:21:17 crc kubenswrapper[4848]: E0217 09:21:17.519205 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80\": container with ID starting with 77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80 not found: ID does not exist" containerID="77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.519247 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80"} err="failed to get container status \"77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80\": rpc error: code = NotFound desc = could not find container \"77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80\": container with ID starting with 77087ae084579cd3541e26a3460dff7cd3b52005fa04147daec0b02417f2ac80 not found: ID does not exist" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.519275 4848 scope.go:117] "RemoveContainer" containerID="243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049" Feb 17 09:21:17 crc kubenswrapper[4848]: E0217 09:21:17.519718 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049\": container with ID starting with 
243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049 not found: ID does not exist" containerID="243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.519787 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049"} err="failed to get container status \"243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049\": rpc error: code = NotFound desc = could not find container \"243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049\": container with ID starting with 243d3710349f12c956818269bffb7761786bd802a2c2ee74a61e49ae0249d049 not found: ID does not exist" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.519825 4848 scope.go:117] "RemoveContainer" containerID="6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973" Feb 17 09:21:17 crc kubenswrapper[4848]: E0217 09:21:17.520235 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973\": container with ID starting with 6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973 not found: ID does not exist" containerID="6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973" Feb 17 09:21:17 crc kubenswrapper[4848]: I0217 09:21:17.520269 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973"} err="failed to get container status \"6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973\": rpc error: code = NotFound desc = could not find container \"6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973\": container with ID starting with 6aed5c03cf91c0b35b7e8a6187368b5bee175fc3f6f99660ee5b0eb62da65973 not found: ID does not 
exist" Feb 17 09:21:18 crc kubenswrapper[4848]: I0217 09:21:18.403390 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvhmn" event={"ID":"0a1f257f-6ee5-49f4-8c89-033b43dc561c","Type":"ContainerStarted","Data":"9f5df400784414601b0dcd5d77928319c121988940c983ab6db1f0026940739f"} Feb 17 09:21:19 crc kubenswrapper[4848]: I0217 09:21:19.408970 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" path="/var/lib/kubelet/pods/d6791ad5-1ad5-45f3-a8a4-f16b14013bba/volumes" Feb 17 09:21:19 crc kubenswrapper[4848]: I0217 09:21:19.418577 4848 generic.go:334] "Generic (PLEG): container finished" podID="eab60454-0853-4ec5-ba88-78e220fab168" containerID="1904915061b9f32e5e6e94a3f18a6812fbce876f6d896d2ccf4f2ad3f2b986f8" exitCode=0 Feb 17 09:21:19 crc kubenswrapper[4848]: I0217 09:21:19.418804 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jjhrd" event={"ID":"eab60454-0853-4ec5-ba88-78e220fab168","Type":"ContainerDied","Data":"1904915061b9f32e5e6e94a3f18a6812fbce876f6d896d2ccf4f2ad3f2b986f8"} Feb 17 09:21:19 crc kubenswrapper[4848]: I0217 09:21:19.421984 4848 generic.go:334] "Generic (PLEG): container finished" podID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerID="9f5df400784414601b0dcd5d77928319c121988940c983ab6db1f0026940739f" exitCode=0 Feb 17 09:21:19 crc kubenswrapper[4848]: I0217 09:21:19.422072 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvhmn" event={"ID":"0a1f257f-6ee5-49f4-8c89-033b43dc561c","Type":"ContainerDied","Data":"9f5df400784414601b0dcd5d77928319c121988940c983ab6db1f0026940739f"} Feb 17 09:21:20 crc kubenswrapper[4848]: I0217 09:21:20.432935 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvhmn" 
event={"ID":"0a1f257f-6ee5-49f4-8c89-033b43dc561c","Type":"ContainerStarted","Data":"7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945"} Feb 17 09:21:20 crc kubenswrapper[4848]: I0217 09:21:20.453307 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-nvhmn" podStartSLOduration=6.02265518 podStartE2EDuration="8.453289564s" podCreationTimestamp="2026-02-17 09:21:12 +0000 UTC" firstStartedPulling="2026-02-17 09:21:17.393443005 +0000 UTC m=+954.936698651" lastFinishedPulling="2026-02-17 09:21:19.824077379 +0000 UTC m=+957.367333035" observedRunningTime="2026-02-17 09:21:20.452100469 +0000 UTC m=+957.995356115" watchObservedRunningTime="2026-02-17 09:21:20.453289564 +0000 UTC m=+957.996545210" Feb 17 09:21:20 crc kubenswrapper[4848]: I0217 09:21:20.815786 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:20 crc kubenswrapper[4848]: I0217 09:21:20.978248 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27nx5\" (UniqueName: \"kubernetes.io/projected/eab60454-0853-4ec5-ba88-78e220fab168-kube-api-access-27nx5\") pod \"eab60454-0853-4ec5-ba88-78e220fab168\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " Feb 17 09:21:20 crc kubenswrapper[4848]: I0217 09:21:20.978291 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-config-data\") pod \"eab60454-0853-4ec5-ba88-78e220fab168\" (UID: \"eab60454-0853-4ec5-ba88-78e220fab168\") " Feb 17 09:21:20 crc kubenswrapper[4848]: I0217 09:21:20.978384 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-combined-ca-bundle\") pod \"eab60454-0853-4ec5-ba88-78e220fab168\" (UID: 
\"eab60454-0853-4ec5-ba88-78e220fab168\") " Feb 17 09:21:20 crc kubenswrapper[4848]: I0217 09:21:20.989538 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eab60454-0853-4ec5-ba88-78e220fab168-kube-api-access-27nx5" (OuterVolumeSpecName: "kube-api-access-27nx5") pod "eab60454-0853-4ec5-ba88-78e220fab168" (UID: "eab60454-0853-4ec5-ba88-78e220fab168"). InnerVolumeSpecName "kube-api-access-27nx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.005296 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eab60454-0853-4ec5-ba88-78e220fab168" (UID: "eab60454-0853-4ec5-ba88-78e220fab168"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.023225 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-config-data" (OuterVolumeSpecName: "config-data") pod "eab60454-0853-4ec5-ba88-78e220fab168" (UID: "eab60454-0853-4ec5-ba88-78e220fab168"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.080433 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.080690 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27nx5\" (UniqueName: \"kubernetes.io/projected/eab60454-0853-4ec5-ba88-78e220fab168-kube-api-access-27nx5\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.080704 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eab60454-0853-4ec5-ba88-78e220fab168-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.442870 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jjhrd" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.443378 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jjhrd" event={"ID":"eab60454-0853-4ec5-ba88-78e220fab168","Type":"ContainerDied","Data":"7ac0dbbc88a2e3f659487838ebb699bb1e8695bf7cd7ec0381cd11d70d0768b0"} Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.443400 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ac0dbbc88a2e3f659487838ebb699bb1e8695bf7cd7ec0381cd11d70d0768b0" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693129 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-k2vj5"] Feb 17 09:21:21 crc kubenswrapper[4848]: E0217 09:21:21.693418 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82b8b5f5-7a6c-4a94-9255-510bd8ac99a3" containerName="mariadb-account-create-update" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 
09:21:21.693434 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="82b8b5f5-7a6c-4a94-9255-510bd8ac99a3" containerName="mariadb-account-create-update" Feb 17 09:21:21 crc kubenswrapper[4848]: E0217 09:21:21.693445 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="272b2170-d012-47c9-9d08-1a696ef88165" containerName="mariadb-account-create-update" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693451 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="272b2170-d012-47c9-9d08-1a696ef88165" containerName="mariadb-account-create-update" Feb 17 09:21:21 crc kubenswrapper[4848]: E0217 09:21:21.693467 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerName="extract-content" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693473 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerName="extract-content" Feb 17 09:21:21 crc kubenswrapper[4848]: E0217 09:21:21.693488 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerName="extract-utilities" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693493 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerName="extract-utilities" Feb 17 09:21:21 crc kubenswrapper[4848]: E0217 09:21:21.693505 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a4b82e2-58b0-4b38-b7db-6882298598c4" containerName="mariadb-database-create" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693511 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a4b82e2-58b0-4b38-b7db-6882298598c4" containerName="mariadb-database-create" Feb 17 09:21:21 crc kubenswrapper[4848]: E0217 09:21:21.693521 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eab60454-0853-4ec5-ba88-78e220fab168" 
containerName="keystone-db-sync" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693528 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab60454-0853-4ec5-ba88-78e220fab168" containerName="keystone-db-sync" Feb 17 09:21:21 crc kubenswrapper[4848]: E0217 09:21:21.693539 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6bc83da-3773-4c42-95b7-56c77fd0fdb1" containerName="mariadb-account-create-update" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693544 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6bc83da-3773-4c42-95b7-56c77fd0fdb1" containerName="mariadb-account-create-update" Feb 17 09:21:21 crc kubenswrapper[4848]: E0217 09:21:21.693553 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00" containerName="mariadb-database-create" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693559 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00" containerName="mariadb-database-create" Feb 17 09:21:21 crc kubenswrapper[4848]: E0217 09:21:21.693566 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04dc10b3-8edf-4385-bdfa-24b322f8355e" containerName="mariadb-database-create" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693571 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="04dc10b3-8edf-4385-bdfa-24b322f8355e" containerName="mariadb-database-create" Feb 17 09:21:21 crc kubenswrapper[4848]: E0217 09:21:21.693579 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerName="registry-server" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693585 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerName="registry-server" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693722 4848 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="eab60454-0853-4ec5-ba88-78e220fab168" containerName="keystone-db-sync" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693732 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6791ad5-1ad5-45f3-a8a4-f16b14013bba" containerName="registry-server" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693741 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a4b82e2-58b0-4b38-b7db-6882298598c4" containerName="mariadb-database-create" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693746 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6bc83da-3773-4c42-95b7-56c77fd0fdb1" containerName="mariadb-account-create-update" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693828 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="04dc10b3-8edf-4385-bdfa-24b322f8355e" containerName="mariadb-database-create" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693838 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="82b8b5f5-7a6c-4a94-9255-510bd8ac99a3" containerName="mariadb-account-create-update" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693847 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="272b2170-d012-47c9-9d08-1a696ef88165" containerName="mariadb-account-create-update" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.693854 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00" containerName="mariadb-database-create" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.698081 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.707045 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-k2vj5"] Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.759336 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rb86p"] Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.760396 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.768639 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.768848 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.769078 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p6mz5" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.769209 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.769313 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.777262 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rb86p"] Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.879323 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-844c79cc9c-crdch"] Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.887971 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.890352 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.890519 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-k7k5p" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.890675 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892302 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-fernet-keys\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892349 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892379 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-credential-keys\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892473 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-combined-ca-bundle\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892509 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892530 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-config-data\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892648 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-swift-storage-0\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892734 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-config\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892768 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-svc\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892790 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5jch\" (UniqueName: \"kubernetes.io/projected/c0c27a71-d721-4332-a7f2-827797a0b19f-kube-api-access-f5jch\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892813 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-scripts\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.892846 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lznbm\" (UniqueName: \"kubernetes.io/projected/e920b3d7-6812-4953-8baf-e0039e7aa8a7-kube-api-access-lznbm\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.893074 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.913660 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844c79cc9c-crdch"] Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.935823 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-zh2hd"] Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.936995 4848 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.941146 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.941346 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nkq89" Feb 17 09:21:21 crc kubenswrapper[4848]: I0217 09:21:21.948187 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005330 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lznbm\" (UniqueName: \"kubernetes.io/projected/e920b3d7-6812-4953-8baf-e0039e7aa8a7-kube-api-access-lznbm\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005409 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-fernet-keys\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005482 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adcdc585-f670-4bc3-be54-acf796d438df-horizon-secret-key\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005516 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005628 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-credential-keys\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005655 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-combined-ca-bundle\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005678 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005699 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-config-data\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005729 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adcdc585-f670-4bc3-be54-acf796d438df-logs\") pod 
\"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005829 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-swift-storage-0\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005854 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-scripts\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005906 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hc76\" (UniqueName: \"kubernetes.io/projected/adcdc585-f670-4bc3-be54-acf796d438df-kube-api-access-5hc76\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005943 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-config\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005968 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-svc\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: 
\"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.005990 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-config-data\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.006011 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5jch\" (UniqueName: \"kubernetes.io/projected/c0c27a71-d721-4332-a7f2-827797a0b19f-kube-api-access-f5jch\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.006035 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-scripts\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.016351 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-credential-keys\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.016836 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-combined-ca-bundle\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 
17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.017315 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-fernet-keys\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.017396 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zh2hd"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.021050 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-svc\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.023797 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-nb\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.025140 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-swift-storage-0\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.025417 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-sb\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " 
pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.029117 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-scripts\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.034129 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-config\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.044523 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-config-data\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.074883 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5jch\" (UniqueName: \"kubernetes.io/projected/c0c27a71-d721-4332-a7f2-827797a0b19f-kube-api-access-f5jch\") pod \"dnsmasq-dns-c4fdd6b7-k2vj5\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.080388 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lznbm\" (UniqueName: \"kubernetes.io/projected/e920b3d7-6812-4953-8baf-e0039e7aa8a7-kube-api-access-lznbm\") pod \"keystone-bootstrap-rb86p\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.091861 4848 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/cinder-db-sync-lh5dl"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.092996 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.094144 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.100986 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lh5dl"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.101397 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-d9d6p" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.101625 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.101742 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.109597 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-config-data\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.109651 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-config\") pod \"neutron-db-sync-zh2hd\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.109706 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/adcdc585-f670-4bc3-be54-acf796d438df-horizon-secret-key\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.109750 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nw2dq\" (UniqueName: \"kubernetes.io/projected/6eaa8789-cc44-4571-a25b-b7a7f56668f8-kube-api-access-nw2dq\") pod \"neutron-db-sync-zh2hd\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.109788 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adcdc585-f670-4bc3-be54-acf796d438df-logs\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.109815 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-combined-ca-bundle\") pod \"neutron-db-sync-zh2hd\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.109851 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-scripts\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.109877 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hc76\" (UniqueName: 
\"kubernetes.io/projected/adcdc585-f670-4bc3-be54-acf796d438df-kube-api-access-5hc76\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.111103 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-config-data\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.118154 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adcdc585-f670-4bc3-be54-acf796d438df-logs\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.118683 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-scripts\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.125086 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adcdc585-f670-4bc3-be54-acf796d438df-horizon-secret-key\") pod \"horizon-844c79cc9c-crdch\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.137405 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hc76\" (UniqueName: \"kubernetes.io/projected/adcdc585-f670-4bc3-be54-acf796d438df-kube-api-access-5hc76\") pod \"horizon-844c79cc9c-crdch\" (UID: 
\"adcdc585-f670-4bc3-be54-acf796d438df\") " pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.147736 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.156675 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.159530 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.159728 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.179834 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.195301 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-57bbdfc68f-j9b8f"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.196692 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.205129 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57bbdfc68f-j9b8f"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.211948 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-config-data\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.212047 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-config\") pod \"neutron-db-sync-zh2hd\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.212102 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr7h8\" (UniqueName: \"kubernetes.io/projected/ba8c67b5-216a-4f60-baee-c1b6211f89ec-kube-api-access-rr7h8\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.212151 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nw2dq\" (UniqueName: \"kubernetes.io/projected/6eaa8789-cc44-4571-a25b-b7a7f56668f8-kube-api-access-nw2dq\") pod \"neutron-db-sync-zh2hd\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.212172 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-scripts\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.212188 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba8c67b5-216a-4f60-baee-c1b6211f89ec-etc-machine-id\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.212208 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-combined-ca-bundle\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.212230 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-combined-ca-bundle\") pod \"neutron-db-sync-zh2hd\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.212249 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-db-sync-config-data\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.216134 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-config\") pod 
\"neutron-db-sync-zh2hd\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.220902 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-jq6l8"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.222226 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.222356 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-combined-ca-bundle\") pod \"neutron-db-sync-zh2hd\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.227398 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nshcl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.227592 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.228010 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.241432 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nw2dq\" (UniqueName: \"kubernetes.io/projected/6eaa8789-cc44-4571-a25b-b7a7f56668f8-kube-api-access-nw2dq\") pod \"neutron-db-sync-zh2hd\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.248299 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jq6l8"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.270120 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-k2vj5"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.270740 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.283163 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.289030 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.290374 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.296109 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v66vs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.296846 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.296860 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.296978 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.303683 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316252 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7h8\" (UniqueName: \"kubernetes.io/projected/ba8c67b5-216a-4f60-baee-c1b6211f89ec-kube-api-access-rr7h8\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316290 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-config-data\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316315 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-scripts\") pod \"ceilometer-0\" (UID: 
\"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316367 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45bz8\" (UniqueName: \"kubernetes.io/projected/19facc80-e9df-42dc-8124-7619b2167b5c-kube-api-access-45bz8\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316387 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316419 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-scripts\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316438 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba8c67b5-216a-4f60-baee-c1b6211f89ec-etc-machine-id\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316457 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-config-data\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc 
kubenswrapper[4848]: I0217 09:21:22.316474 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-combined-ca-bundle\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316497 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d790c2ab-67aa-4c46-9407-9fa991223dd0-horizon-secret-key\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316513 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-db-sync-config-data\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316531 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d790c2ab-67aa-4c46-9407-9fa991223dd0-logs\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316556 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d24gr\" (UniqueName: \"kubernetes.io/projected/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-kube-api-access-d24gr\") pod \"barbican-db-sync-jq6l8\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316573 
4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-scripts\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316611 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-combined-ca-bundle\") pod \"barbican-db-sync-jq6l8\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316631 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316650 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-config-data\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316666 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316682 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316707 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdj88\" (UniqueName: \"kubernetes.io/projected/d790c2ab-67aa-4c46-9407-9fa991223dd0-kube-api-access-mdj88\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.316724 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-db-sync-config-data\") pod \"barbican-db-sync-jq6l8\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.318012 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba8c67b5-216a-4f60-baee-c1b6211f89ec-etc-machine-id\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.322300 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-scripts\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.326268 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-db-sync-config-data\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.327050 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-config-data\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.330102 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-combined-ca-bundle\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.338086 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-758zs"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.339403 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.366899 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr7h8\" (UniqueName: \"kubernetes.io/projected/ba8c67b5-216a-4f60-baee-c1b6211f89ec-kube-api-access-rr7h8\") pod \"cinder-db-sync-lh5dl\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.376318 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-758zs"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.377426 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-ptthx"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.379586 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.382920 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7stp8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.383069 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.383289 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.407093 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ptthx"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.418611 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419063 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419091 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419130 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdj88\" (UniqueName: \"kubernetes.io/projected/d790c2ab-67aa-4c46-9407-9fa991223dd0-kube-api-access-mdj88\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419154 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-db-sync-config-data\") pod \"barbican-db-sync-jq6l8\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419173 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419207 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419228 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419246 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-config-data\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419276 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-scripts\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419298 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419322 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-g7s5n\" (UniqueName: \"kubernetes.io/projected/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-kube-api-access-g7s5n\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419346 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45bz8\" (UniqueName: \"kubernetes.io/projected/19facc80-e9df-42dc-8124-7619b2167b5c-kube-api-access-45bz8\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419370 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419410 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419430 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-config-data\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419457 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/d790c2ab-67aa-4c46-9407-9fa991223dd0-horizon-secret-key\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419484 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d790c2ab-67aa-4c46-9407-9fa991223dd0-logs\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419511 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d24gr\" (UniqueName: \"kubernetes.io/projected/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-kube-api-access-d24gr\") pod \"barbican-db-sync-jq6l8\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419537 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-scripts\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419558 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419605 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-combined-ca-bundle\") pod 
\"barbican-db-sync-jq6l8\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.419626 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.420075 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-run-httpd\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.420357 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-log-httpd\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.427069 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d790c2ab-67aa-4c46-9407-9fa991223dd0-logs\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.428258 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-db-sync-config-data\") pod \"barbican-db-sync-jq6l8\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.431665 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-scripts\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.432402 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-config-data\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.437601 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.440162 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d790c2ab-67aa-4c46-9407-9fa991223dd0-horizon-secret-key\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.440322 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-combined-ca-bundle\") pod \"barbican-db-sync-jq6l8\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.440705 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-config-data\") pod \"ceilometer-0\" (UID: 
\"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.444626 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45bz8\" (UniqueName: \"kubernetes.io/projected/19facc80-e9df-42dc-8124-7619b2167b5c-kube-api-access-45bz8\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.444800 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.445656 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-scripts\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.445143 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.447326 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdj88\" (UniqueName: \"kubernetes.io/projected/d790c2ab-67aa-4c46-9407-9fa991223dd0-kube-api-access-mdj88\") pod \"horizon-57bbdfc68f-j9b8f\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.449952 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d24gr\" (UniqueName: \"kubernetes.io/projected/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-kube-api-access-d24gr\") pod \"barbican-db-sync-jq6l8\" (UID: 
\"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.450709 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.455487 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.455652 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.466014 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520115 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520536 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-svc\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520581 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-sb\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520609 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520655 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njvmh\" (UniqueName: \"kubernetes.io/projected/9e0d87d6-fd65-4565-971f-070050f2f9ff-kube-api-access-njvmh\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520679 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-nb\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520704 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520739 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-config-data\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520787 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520819 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520844 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520865 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-config\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520901 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-combined-ca-bundle\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520935 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520959 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-swift-storage-0\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520978 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e0d87d6-fd65-4565-971f-070050f2f9ff-logs\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.520999 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7s5n\" (UniqueName: \"kubernetes.io/projected/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-kube-api-access-g7s5n\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.521017 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v5bt\" (UniqueName: \"kubernetes.io/projected/5a53a69c-528b-4df5-9502-69316c3f345d-kube-api-access-4v5bt\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.521064 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.521088 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-scripts\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.521819 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.524187 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.524470 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.526353 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-scripts\") pod 
\"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.526921 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.528379 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.539837 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7s5n\" (UniqueName: \"kubernetes.io/projected/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-kube-api-access-g7s5n\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.548416 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.551435 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.572717 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.574905 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.604965 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622033 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-config-data\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622085 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwk7\" (UniqueName: \"kubernetes.io/projected/794ee8e3-ecee-4960-9676-c11a1bf988d1-kube-api-access-xxwk7\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622118 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-config\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622149 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-logs\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622166 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-combined-ca-bundle\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " 
pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622235 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622289 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-swift-storage-0\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622328 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e0d87d6-fd65-4565-971f-070050f2f9ff-logs\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622356 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v5bt\" (UniqueName: \"kubernetes.io/projected/5a53a69c-528b-4df5-9502-69316c3f345d-kube-api-access-4v5bt\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622395 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 
17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622465 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622494 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622521 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622548 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-scripts\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622578 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-svc\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc 
kubenswrapper[4848]: I0217 09:21:22.622638 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-sb\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622708 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622729 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njvmh\" (UniqueName: \"kubernetes.io/projected/9e0d87d6-fd65-4565-971f-070050f2f9ff-kube-api-access-njvmh\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622741 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.622790 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-nb\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.623318 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-config\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.623583 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-nb\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.623874 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-swift-storage-0\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.623967 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-svc\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc 
kubenswrapper[4848]: I0217 09:21:22.624166 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e0d87d6-fd65-4565-971f-070050f2f9ff-logs\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.624172 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-sb\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.632110 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-config-data\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.632147 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-combined-ca-bundle\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.632369 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-scripts\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.645630 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v5bt\" (UniqueName: 
\"kubernetes.io/projected/5a53a69c-528b-4df5-9502-69316c3f345d-kube-api-access-4v5bt\") pod \"dnsmasq-dns-69c85d5ff7-758zs\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.650201 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njvmh\" (UniqueName: \"kubernetes.io/projected/9e0d87d6-fd65-4565-971f-070050f2f9ff-kube-api-access-njvmh\") pod \"placement-db-sync-ptthx\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.685159 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.709303 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-ptthx" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.727192 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.727304 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxwk7\" (UniqueName: \"kubernetes.io/projected/794ee8e3-ecee-4960-9676-c11a1bf988d1-kube-api-access-xxwk7\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.727362 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-logs\") pod 
\"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.727380 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.727415 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.727457 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.727461 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.728481 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.728734 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-logs\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.727481 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.731025 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.747626 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.748624 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" 
Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.760748 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-scripts\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.761886 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-config-data\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.785956 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxwk7\" (UniqueName: \"kubernetes.io/projected/794ee8e3-ecee-4960-9676-c11a1bf988d1-kube-api-access-xxwk7\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.813464 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rb86p"] Feb 17 09:21:22 crc kubenswrapper[4848]: I0217 09:21:22.822314 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.081024 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.084330 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844c79cc9c-crdch"] Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.104157 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-zh2hd"] Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.117354 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-k2vj5"] Feb 17 09:21:23 crc kubenswrapper[4848]: W0217 09:21:23.137387 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eaa8789_cc44_4571_a25b_b7a7f56668f8.slice/crio-6b4fda0c707564e395863d613f42aec4e31179938891c84dd3104b25dac40c89 WatchSource:0}: Error finding container 6b4fda0c707564e395863d613f42aec4e31179938891c84dd3104b25dac40c89: Status 404 returned error can't find the container with id 6b4fda0c707564e395863d613f42aec4e31179938891c84dd3104b25dac40c89 Feb 17 09:21:23 crc kubenswrapper[4848]: W0217 09:21:23.147237 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0c27a71_d721_4332_a7f2_827797a0b19f.slice/crio-18ab747e7fee14bc6b38ebbb56016bd72daab0f55bd778eb30b1e46b76629853 WatchSource:0}: Error finding container 18ab747e7fee14bc6b38ebbb56016bd72daab0f55bd778eb30b1e46b76629853: Status 404 returned error can't find the container with id 18ab747e7fee14bc6b38ebbb56016bd72daab0f55bd778eb30b1e46b76629853 Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.193568 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.220452 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-lh5dl"] Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.251439 
4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-57bbdfc68f-j9b8f"] Feb 17 09:21:23 crc kubenswrapper[4848]: W0217 09:21:23.259153 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd790c2ab_67aa_4c46_9407_9fa991223dd0.slice/crio-834cca69968e35eaae5f0294086e2e803f3a83063138d1a657c7a2dabcdbb3f1 WatchSource:0}: Error finding container 834cca69968e35eaae5f0294086e2e803f3a83063138d1a657c7a2dabcdbb3f1: Status 404 returned error can't find the container with id 834cca69968e35eaae5f0294086e2e803f3a83063138d1a657c7a2dabcdbb3f1 Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.283637 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.283687 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.351966 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.425342 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-jq6l8"] Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.526077 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bbdfc68f-j9b8f" event={"ID":"d790c2ab-67aa-4c46-9407-9fa991223dd0","Type":"ContainerStarted","Data":"834cca69968e35eaae5f0294086e2e803f3a83063138d1a657c7a2dabcdbb3f1"} Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.551455 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zh2hd" event={"ID":"6eaa8789-cc44-4571-a25b-b7a7f56668f8","Type":"ContainerStarted","Data":"6074f51855bf7d05823997c5ea7977da776d02f7b85e02c3d39e1c949344f06b"} Feb 17 09:21:23 
crc kubenswrapper[4848]: I0217 09:21:23.551506 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zh2hd" event={"ID":"6eaa8789-cc44-4571-a25b-b7a7f56668f8","Type":"ContainerStarted","Data":"6b4fda0c707564e395863d613f42aec4e31179938891c84dd3104b25dac40c89"} Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.573050 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rb86p" event={"ID":"e920b3d7-6812-4953-8baf-e0039e7aa8a7","Type":"ContainerStarted","Data":"a756de62b5a6db3232fab62b0e393c42a2f0617bf011c488351948bf3ac8cc14"} Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.573090 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rb86p" event={"ID":"e920b3d7-6812-4953-8baf-e0039e7aa8a7","Type":"ContainerStarted","Data":"b2c312ef0b29f439de77943f585f9d85ac68c34e72f9f9b0f0c7e4b6a1dd7a84"} Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.587579 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c79cc9c-crdch" event={"ID":"adcdc585-f670-4bc3-be54-acf796d438df","Type":"ContainerStarted","Data":"ea40469db3244034845afbf0ff0731b52963ca757a73f28346fe2fe701edaaa5"} Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.611115 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19facc80-e9df-42dc-8124-7619b2167b5c","Type":"ContainerStarted","Data":"e2c00f7696e586e0bd4794022b2ae6dbd600c53ad91c67947bdaaf029dfc8b03"} Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.626636 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lh5dl" event={"ID":"ba8c67b5-216a-4f60-baee-c1b6211f89ec","Type":"ContainerStarted","Data":"9ffb7dcd3c99ade820dfc37ea257ff6575cd1dc4c0bd6ae46857933d256e0926"} Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.631650 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jq6l8" 
event={"ID":"dcb8e782-fe60-4c41-a843-1980fc8ab3cc","Type":"ContainerStarted","Data":"45f640070fa3ef2f5dda9892b6cb7dd122c44c384928d27ce8e0aa46dea0cfef"} Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.646020 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-ptthx"] Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.666879 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" event={"ID":"c0c27a71-d721-4332-a7f2-827797a0b19f","Type":"ContainerStarted","Data":"586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410"} Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.666912 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" event={"ID":"c0c27a71-d721-4332-a7f2-827797a0b19f","Type":"ContainerStarted","Data":"18ab747e7fee14bc6b38ebbb56016bd72daab0f55bd778eb30b1e46b76629853"} Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.834684 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-758zs"] Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.852866 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rb86p" podStartSLOduration=2.85284832 podStartE2EDuration="2.85284832s" podCreationTimestamp="2026-02-17 09:21:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:23.852090658 +0000 UTC m=+961.395346294" watchObservedRunningTime="2026-02-17 09:21:23.85284832 +0000 UTC m=+961.396103966" Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.906544 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-zh2hd" podStartSLOduration=2.9065264490000002 podStartE2EDuration="2.906526449s" podCreationTimestamp="2026-02-17 09:21:21 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:23.887166977 +0000 UTC m=+961.430422633" watchObservedRunningTime="2026-02-17 09:21:23.906526449 +0000 UTC m=+961.449782095" Feb 17 09:21:23 crc kubenswrapper[4848]: I0217 09:21:23.972199 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.306263 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.391067 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-svc\") pod \"c0c27a71-d721-4332-a7f2-827797a0b19f\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.391250 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5jch\" (UniqueName: \"kubernetes.io/projected/c0c27a71-d721-4332-a7f2-827797a0b19f-kube-api-access-f5jch\") pod \"c0c27a71-d721-4332-a7f2-827797a0b19f\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.391291 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-nb\") pod \"c0c27a71-d721-4332-a7f2-827797a0b19f\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.391307 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-config\") pod \"c0c27a71-d721-4332-a7f2-827797a0b19f\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " Feb 17 
09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.391325 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-swift-storage-0\") pod \"c0c27a71-d721-4332-a7f2-827797a0b19f\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.391361 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-sb\") pod \"c0c27a71-d721-4332-a7f2-827797a0b19f\" (UID: \"c0c27a71-d721-4332-a7f2-827797a0b19f\") " Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.432890 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0c27a71-d721-4332-a7f2-827797a0b19f-kube-api-access-f5jch" (OuterVolumeSpecName: "kube-api-access-f5jch") pod "c0c27a71-d721-4332-a7f2-827797a0b19f" (UID: "c0c27a71-d721-4332-a7f2-827797a0b19f"). InnerVolumeSpecName "kube-api-access-f5jch". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.444357 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c0c27a71-d721-4332-a7f2-827797a0b19f" (UID: "c0c27a71-d721-4332-a7f2-827797a0b19f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.455371 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c0c27a71-d721-4332-a7f2-827797a0b19f" (UID: "c0c27a71-d721-4332-a7f2-827797a0b19f"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.456526 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c0c27a71-d721-4332-a7f2-827797a0b19f" (UID: "c0c27a71-d721-4332-a7f2-827797a0b19f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.467755 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-config" (OuterVolumeSpecName: "config") pod "c0c27a71-d721-4332-a7f2-827797a0b19f" (UID: "c0c27a71-d721-4332-a7f2-827797a0b19f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.486428 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c0c27a71-d721-4332-a7f2-827797a0b19f" (UID: "c0c27a71-d721-4332-a7f2-827797a0b19f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.507267 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.508429 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.508451 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5jch\" (UniqueName: \"kubernetes.io/projected/c0c27a71-d721-4332-a7f2-827797a0b19f-kube-api-access-f5jch\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.508464 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.508475 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.508484 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.508492 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c0c27a71-d721-4332-a7f2-827797a0b19f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.545481 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57bbdfc68f-j9b8f"] Feb 17 09:21:24 
crc kubenswrapper[4848]: I0217 09:21:24.554039 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.580899 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-775dd6779c-j9k6n"] Feb 17 09:21:24 crc kubenswrapper[4848]: E0217 09:21:24.581342 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0c27a71-d721-4332-a7f2-827797a0b19f" containerName="init" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.581360 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0c27a71-d721-4332-a7f2-827797a0b19f" containerName="init" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.581529 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0c27a71-d721-4332-a7f2-827797a0b19f" containerName="init" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.582490 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.589335 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.598808 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-775dd6779c-j9k6n"] Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.607987 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.612497 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8226a25-977f-4934-b40b-7504ab8f23e4-logs\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.612782 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-scripts\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.612904 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-config-data\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.613030 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8226a25-977f-4934-b40b-7504ab8f23e4-horizon-secret-key\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.613146 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fk6fp\" (UniqueName: \"kubernetes.io/projected/f8226a25-977f-4934-b40b-7504ab8f23e4-kube-api-access-fk6fp\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.707881 4848 generic.go:334] "Generic (PLEG): container finished" podID="5a53a69c-528b-4df5-9502-69316c3f345d" containerID="8f981bde4acf13c57b7568f1a14677c9dde5003043b88fe191980311b72a15a4" exitCode=0 Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.707991 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" 
event={"ID":"5a53a69c-528b-4df5-9502-69316c3f345d","Type":"ContainerDied","Data":"8f981bde4acf13c57b7568f1a14677c9dde5003043b88fe191980311b72a15a4"} Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.708036 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" event={"ID":"5a53a69c-528b-4df5-9502-69316c3f345d","Type":"ContainerStarted","Data":"f9051a4a4a8a18de9820f7f57278f0021e79f8581ae2c476caa0f6cfb3fb89bd"} Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.711905 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ptthx" event={"ID":"9e0d87d6-fd65-4565-971f-070050f2f9ff","Type":"ContainerStarted","Data":"5bc983f81013248609bcd710557e0ae5b6eae67846db99d4bd96bce658eaa39e"} Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.714888 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8226a25-977f-4934-b40b-7504ab8f23e4-logs\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.714933 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-scripts\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.714968 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-config-data\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.715001 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8226a25-977f-4934-b40b-7504ab8f23e4-horizon-secret-key\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.715054 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fk6fp\" (UniqueName: \"kubernetes.io/projected/f8226a25-977f-4934-b40b-7504ab8f23e4-kube-api-access-fk6fp\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.715861 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8226a25-977f-4934-b40b-7504ab8f23e4-logs\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.717100 4848 generic.go:334] "Generic (PLEG): container finished" podID="c0c27a71-d721-4332-a7f2-827797a0b19f" containerID="586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410" exitCode=0 Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.717173 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" event={"ID":"c0c27a71-d721-4332-a7f2-827797a0b19f","Type":"ContainerDied","Data":"586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410"} Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.717198 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" event={"ID":"c0c27a71-d721-4332-a7f2-827797a0b19f","Type":"ContainerDied","Data":"18ab747e7fee14bc6b38ebbb56016bd72daab0f55bd778eb30b1e46b76629853"} Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.717221 4848 scope.go:117] 
"RemoveContainer" containerID="586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.717313 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c4fdd6b7-k2vj5" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.719091 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-scripts\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.719569 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-config-data\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.720106 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8226a25-977f-4934-b40b-7504ab8f23e4-horizon-secret-key\") pod \"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.742051 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"794ee8e3-ecee-4960-9676-c11a1bf988d1","Type":"ContainerStarted","Data":"c40c24f63d12d510beeede8cdbf50c12e595d1cae3a8dda929a9c05e55491d5c"} Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.745418 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fk6fp\" (UniqueName: \"kubernetes.io/projected/f8226a25-977f-4934-b40b-7504ab8f23e4-kube-api-access-fk6fp\") pod 
\"horizon-775dd6779c-j9k6n\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.747412 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4","Type":"ContainerStarted","Data":"71a9ab82dfdb88ba7ac0219314940d8ff63cbf5cefae1a899858689a8b228420"} Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.775734 4848 scope.go:117] "RemoveContainer" containerID="586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410" Feb 17 09:21:24 crc kubenswrapper[4848]: E0217 09:21:24.776401 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410\": container with ID starting with 586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410 not found: ID does not exist" containerID="586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.776437 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410"} err="failed to get container status \"586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410\": rpc error: code = NotFound desc = could not find container \"586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410\": container with ID starting with 586688665afaef707c98cbbdea170ba8dad906394df28084ff6b92626b124410 not found: ID does not exist" Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.855985 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-k2vj5"] Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.869112 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-c4fdd6b7-k2vj5"] 
Feb 17 09:21:24 crc kubenswrapper[4848]: I0217 09:21:24.899101 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 09:21:25.415157 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0c27a71-d721-4332-a7f2-827797a0b19f" path="/var/lib/kubelet/pods/c0c27a71-d721-4332-a7f2-827797a0b19f/volumes" Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 09:21:25.417097 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 09:21:25.493229 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 09:21:25.585323 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-775dd6779c-j9k6n"] Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 09:21:25.750609 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r98sw"] Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 09:21:25.762540 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"794ee8e3-ecee-4960-9676-c11a1bf988d1","Type":"ContainerStarted","Data":"8269c1d1041d03962c00dcb5cfdc6acea52c87c556d0d5f799bb123be105cd15"} Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 09:21:25.764715 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" event={"ID":"5a53a69c-528b-4df5-9502-69316c3f345d","Type":"ContainerStarted","Data":"b9c8eca747406d76a11c9d3666069d1d5f8e52aa19065a6c9ef06176f6369321"} Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 09:21:25.764904 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 
09:21:25.767456 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4","Type":"ContainerStarted","Data":"865528d5abf599d39a4c21ab26182a61d99de95a407b3de598a7e1735ddb1ae5"} Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 09:21:25.768649 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-775dd6779c-j9k6n" event={"ID":"f8226a25-977f-4934-b40b-7504ab8f23e4","Type":"ContainerStarted","Data":"2d302a0145acf1b92e15433961d8672c9ac65c76bbb2eb46adcb8070e872fbcb"} Feb 17 09:21:25 crc kubenswrapper[4848]: I0217 09:21:25.791209 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" podStartSLOduration=3.791191356 podStartE2EDuration="3.791191356s" podCreationTimestamp="2026-02-17 09:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:25.78650593 +0000 UTC m=+963.329761586" watchObservedRunningTime="2026-02-17 09:21:25.791191356 +0000 UTC m=+963.334447002" Feb 17 09:21:26 crc kubenswrapper[4848]: I0217 09:21:26.833684 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4","Type":"ContainerStarted","Data":"ac6fa42deaa2a02c23979dcd4cf755d438c45e25801f01b8395073c61667e26e"} Feb 17 09:21:26 crc kubenswrapper[4848]: I0217 09:21:26.834090 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" containerName="glance-log" containerID="cri-o://865528d5abf599d39a4c21ab26182a61d99de95a407b3de598a7e1735ddb1ae5" gracePeriod=30 Feb 17 09:21:26 crc kubenswrapper[4848]: I0217 09:21:26.834363 4848 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-internal-api-0" podUID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" containerName="glance-httpd" containerID="cri-o://ac6fa42deaa2a02c23979dcd4cf755d438c45e25801f01b8395073c61667e26e" gracePeriod=30 Feb 17 09:21:26 crc kubenswrapper[4848]: I0217 09:21:26.849626 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="794ee8e3-ecee-4960-9676-c11a1bf988d1" containerName="glance-log" containerID="cri-o://8269c1d1041d03962c00dcb5cfdc6acea52c87c556d0d5f799bb123be105cd15" gracePeriod=30 Feb 17 09:21:26 crc kubenswrapper[4848]: I0217 09:21:26.849854 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"794ee8e3-ecee-4960-9676-c11a1bf988d1","Type":"ContainerStarted","Data":"79028113ed783eab5500a25672d8bee328b17042b779d0531bb0c6fbca5506b4"} Feb 17 09:21:26 crc kubenswrapper[4848]: I0217 09:21:26.849977 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-r98sw" podUID="a097e9f6-c779-42e2-8bbf-767934d60341" containerName="registry-server" containerID="cri-o://0045bb83a18a7b56c8feaf6618b65f2f1b3fe11cb129f86bdc6604c4a7e5a9dd" gracePeriod=2 Feb 17 09:21:26 crc kubenswrapper[4848]: I0217 09:21:26.850054 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="794ee8e3-ecee-4960-9676-c11a1bf988d1" containerName="glance-httpd" containerID="cri-o://79028113ed783eab5500a25672d8bee328b17042b779d0531bb0c6fbca5506b4" gracePeriod=30 Feb 17 09:21:26 crc kubenswrapper[4848]: I0217 09:21:26.862828 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.86281202 podStartE2EDuration="4.86281202s" podCreationTimestamp="2026-02-17 09:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:26.859392571 +0000 UTC m=+964.402648217" watchObservedRunningTime="2026-02-17 09:21:26.86281202 +0000 UTC m=+964.406067666" Feb 17 09:21:26 crc kubenswrapper[4848]: I0217 09:21:26.888713 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.8886999620000005 podStartE2EDuration="4.888699962s" podCreationTimestamp="2026-02-17 09:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:26.887989482 +0000 UTC m=+964.431245128" watchObservedRunningTime="2026-02-17 09:21:26.888699962 +0000 UTC m=+964.431955608" Feb 17 09:21:27 crc kubenswrapper[4848]: I0217 09:21:27.872402 4848 generic.go:334] "Generic (PLEG): container finished" podID="794ee8e3-ecee-4960-9676-c11a1bf988d1" containerID="79028113ed783eab5500a25672d8bee328b17042b779d0531bb0c6fbca5506b4" exitCode=0 Feb 17 09:21:27 crc kubenswrapper[4848]: I0217 09:21:27.872891 4848 generic.go:334] "Generic (PLEG): container finished" podID="794ee8e3-ecee-4960-9676-c11a1bf988d1" containerID="8269c1d1041d03962c00dcb5cfdc6acea52c87c556d0d5f799bb123be105cd15" exitCode=143 Feb 17 09:21:27 crc kubenswrapper[4848]: I0217 09:21:27.872678 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"794ee8e3-ecee-4960-9676-c11a1bf988d1","Type":"ContainerDied","Data":"79028113ed783eab5500a25672d8bee328b17042b779d0531bb0c6fbca5506b4"} Feb 17 09:21:27 crc kubenswrapper[4848]: I0217 09:21:27.872965 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"794ee8e3-ecee-4960-9676-c11a1bf988d1","Type":"ContainerDied","Data":"8269c1d1041d03962c00dcb5cfdc6acea52c87c556d0d5f799bb123be105cd15"} Feb 17 09:21:27 crc kubenswrapper[4848]: I0217 09:21:27.876809 4848 
generic.go:334] "Generic (PLEG): container finished" podID="a097e9f6-c779-42e2-8bbf-767934d60341" containerID="0045bb83a18a7b56c8feaf6618b65f2f1b3fe11cb129f86bdc6604c4a7e5a9dd" exitCode=0 Feb 17 09:21:27 crc kubenswrapper[4848]: I0217 09:21:27.876860 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r98sw" event={"ID":"a097e9f6-c779-42e2-8bbf-767934d60341","Type":"ContainerDied","Data":"0045bb83a18a7b56c8feaf6618b65f2f1b3fe11cb129f86bdc6604c4a7e5a9dd"} Feb 17 09:21:27 crc kubenswrapper[4848]: I0217 09:21:27.880788 4848 generic.go:334] "Generic (PLEG): container finished" podID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" containerID="ac6fa42deaa2a02c23979dcd4cf755d438c45e25801f01b8395073c61667e26e" exitCode=0 Feb 17 09:21:27 crc kubenswrapper[4848]: I0217 09:21:27.880813 4848 generic.go:334] "Generic (PLEG): container finished" podID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" containerID="865528d5abf599d39a4c21ab26182a61d99de95a407b3de598a7e1735ddb1ae5" exitCode=143 Feb 17 09:21:27 crc kubenswrapper[4848]: I0217 09:21:27.880834 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4","Type":"ContainerDied","Data":"ac6fa42deaa2a02c23979dcd4cf755d438c45e25801f01b8395073c61667e26e"} Feb 17 09:21:27 crc kubenswrapper[4848]: I0217 09:21:27.880855 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4","Type":"ContainerDied","Data":"865528d5abf599d39a4c21ab26182a61d99de95a407b3de598a7e1735ddb1ae5"} Feb 17 09:21:28 crc kubenswrapper[4848]: I0217 09:21:28.893464 4848 generic.go:334] "Generic (PLEG): container finished" podID="e920b3d7-6812-4953-8baf-e0039e7aa8a7" containerID="a756de62b5a6db3232fab62b0e393c42a2f0617bf011c488351948bf3ac8cc14" exitCode=0 Feb 17 09:21:28 crc kubenswrapper[4848]: I0217 09:21:28.893538 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-bootstrap-rb86p" event={"ID":"e920b3d7-6812-4953-8baf-e0039e7aa8a7","Type":"ContainerDied","Data":"a756de62b5a6db3232fab62b0e393c42a2f0617bf011c488351948bf3ac8cc14"} Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.912206 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"794ee8e3-ecee-4960-9676-c11a1bf988d1","Type":"ContainerDied","Data":"c40c24f63d12d510beeede8cdbf50c12e595d1cae3a8dda929a9c05e55491d5c"} Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.912252 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c40c24f63d12d510beeede8cdbf50c12e595d1cae3a8dda929a9c05e55491d5c" Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.913629 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r98sw" Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.918367 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r98sw" event={"ID":"a097e9f6-c779-42e2-8bbf-767934d60341","Type":"ContainerDied","Data":"9bc0322ba50760f17f36a258074edfecb1945790f9c8680f6b5f6fca59a951e1"} Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.918418 4848 scope.go:117] "RemoveContainer" containerID="0045bb83a18a7b56c8feaf6618b65f2f1b3fe11cb129f86bdc6604c4a7e5a9dd" Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.922221 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.926202 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4","Type":"ContainerDied","Data":"71a9ab82dfdb88ba7ac0219314940d8ff63cbf5cefae1a899858689a8b228420"} Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.926247 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="71a9ab82dfdb88ba7ac0219314940d8ff63cbf5cefae1a899858689a8b228420" Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.930174 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-utilities\") pod \"a097e9f6-c779-42e2-8bbf-767934d60341\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.930705 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-utilities" (OuterVolumeSpecName: "utilities") pod "a097e9f6-c779-42e2-8bbf-767934d60341" (UID: "a097e9f6-c779-42e2-8bbf-767934d60341"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.930851 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-catalog-content\") pod \"a097e9f6-c779-42e2-8bbf-767934d60341\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.930918 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdkpv\" (UniqueName: \"kubernetes.io/projected/a097e9f6-c779-42e2-8bbf-767934d60341-kube-api-access-xdkpv\") pod \"a097e9f6-c779-42e2-8bbf-767934d60341\" (UID: \"a097e9f6-c779-42e2-8bbf-767934d60341\") " Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.931336 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.937173 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a097e9f6-c779-42e2-8bbf-767934d60341-kube-api-access-xdkpv" (OuterVolumeSpecName: "kube-api-access-xdkpv") pod "a097e9f6-c779-42e2-8bbf-767934d60341" (UID: "a097e9f6-c779-42e2-8bbf-767934d60341"). InnerVolumeSpecName "kube-api-access-xdkpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:29 crc kubenswrapper[4848]: I0217 09:21:29.986769 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035295 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-combined-ca-bundle\") pod \"794ee8e3-ecee-4960-9676-c11a1bf988d1\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035332 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-httpd-run\") pod \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035355 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7s5n\" (UniqueName: \"kubernetes.io/projected/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-kube-api-access-g7s5n\") pod \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035387 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") " Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035426 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-httpd-run\") pod \"794ee8e3-ecee-4960-9676-c11a1bf988d1\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") " Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035456 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-scripts\") pod \"794ee8e3-ecee-4960-9676-c11a1bf988d1\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035475 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-logs\") pod \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035505 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-combined-ca-bundle\") pod \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035523 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-logs\") pod \"794ee8e3-ecee-4960-9676-c11a1bf988d1\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035570 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxwk7\" (UniqueName: \"kubernetes.io/projected/794ee8e3-ecee-4960-9676-c11a1bf988d1-kube-api-access-xxwk7\") pod \"794ee8e3-ecee-4960-9676-c11a1bf988d1\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035597 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"794ee8e3-ecee-4960-9676-c11a1bf988d1\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035616 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-scripts\") pod \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035633 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-public-tls-certs\") pod \"794ee8e3-ecee-4960-9676-c11a1bf988d1\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035657 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-config-data\") pod \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035685 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-config-data\") pod \"794ee8e3-ecee-4960-9676-c11a1bf988d1\" (UID: \"794ee8e3-ecee-4960-9676-c11a1bf988d1\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.035735 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-internal-tls-certs\") pod \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\" (UID: \"c9b9c133-cba0-40cd-8d45-553e6c9d9fa4\") "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.036097 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdkpv\" (UniqueName: \"kubernetes.io/projected/a097e9f6-c779-42e2-8bbf-767934d60341-kube-api-access-xdkpv\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.037358 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" (UID: "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.040818 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-scripts" (OuterVolumeSpecName: "scripts") pod "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" (UID: "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.041504 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-logs" (OuterVolumeSpecName: "logs") pod "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" (UID: "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.041611 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "794ee8e3-ecee-4960-9676-c11a1bf988d1" (UID: "794ee8e3-ecee-4960-9676-c11a1bf988d1"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.041961 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-logs" (OuterVolumeSpecName: "logs") pod "794ee8e3-ecee-4960-9676-c11a1bf988d1" (UID: "794ee8e3-ecee-4960-9676-c11a1bf988d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.042556 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-kube-api-access-g7s5n" (OuterVolumeSpecName: "kube-api-access-g7s5n") pod "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" (UID: "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4"). InnerVolumeSpecName "kube-api-access-g7s5n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.042587 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a097e9f6-c779-42e2-8bbf-767934d60341" (UID: "a097e9f6-c779-42e2-8bbf-767934d60341"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.044247 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" (UID: "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.050100 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-scripts" (OuterVolumeSpecName: "scripts") pod "794ee8e3-ecee-4960-9676-c11a1bf988d1" (UID: "794ee8e3-ecee-4960-9676-c11a1bf988d1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.056014 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "794ee8e3-ecee-4960-9676-c11a1bf988d1" (UID: "794ee8e3-ecee-4960-9676-c11a1bf988d1"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.057429 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/794ee8e3-ecee-4960-9676-c11a1bf988d1-kube-api-access-xxwk7" (OuterVolumeSpecName: "kube-api-access-xxwk7") pod "794ee8e3-ecee-4960-9676-c11a1bf988d1" (UID: "794ee8e3-ecee-4960-9676-c11a1bf988d1"). InnerVolumeSpecName "kube-api-access-xxwk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.091450 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" (UID: "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.094222 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-config-data" (OuterVolumeSpecName: "config-data") pod "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" (UID: "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.097916 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "794ee8e3-ecee-4960-9676-c11a1bf988d1" (UID: "794ee8e3-ecee-4960-9676-c11a1bf988d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.132669 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" (UID: "c9b9c133-cba0-40cd-8d45-553e6c9d9fa4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138444 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138477 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-logs\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138531 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138541 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-logs\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138551 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxwk7\" (UniqueName: \"kubernetes.io/projected/794ee8e3-ecee-4960-9676-c11a1bf988d1-kube-api-access-xxwk7\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138601 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138613 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138621 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138630 4848 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138639 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a097e9f6-c779-42e2-8bbf-767934d60341-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138648 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138674 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138684 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7s5n\" (UniqueName: \"kubernetes.io/projected/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4-kube-api-access-g7s5n\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138701 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" "
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.138711 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/794ee8e3-ecee-4960-9676-c11a1bf988d1-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.156182 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.157594 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc"
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.164966 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "794ee8e3-ecee-4960-9676-c11a1bf988d1" (UID: "794ee8e3-ecee-4960-9676-c11a1bf988d1"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.165774 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-config-data" (OuterVolumeSpecName: "config-data") pod "794ee8e3-ecee-4960-9676-c11a1bf988d1" (UID: "794ee8e3-ecee-4960-9676-c11a1bf988d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.240661 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.240702 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.240714 4848 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.240723 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/794ee8e3-ecee-4960-9676-c11a1bf988d1-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.938178 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r98sw"
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.938173 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.938238 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:30 crc kubenswrapper[4848]: I0217 09:21:30.992638 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-r98sw"]
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.006894 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-r98sw"]
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.016268 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.031108 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.042602 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.056777 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.082366 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 09:21:31 crc kubenswrapper[4848]: E0217 09:21:31.083127 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a097e9f6-c779-42e2-8bbf-767934d60341" containerName="extract-content"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.083153 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a097e9f6-c779-42e2-8bbf-767934d60341" containerName="extract-content"
Feb 17 09:21:31 crc kubenswrapper[4848]: E0217 09:21:31.083178 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ee8e3-ecee-4960-9676-c11a1bf988d1" containerName="glance-log"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.083187 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="794ee8e3-ecee-4960-9676-c11a1bf988d1" containerName="glance-log"
Feb 17 09:21:31 crc kubenswrapper[4848]: E0217 09:21:31.083205 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" containerName="glance-log"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.083213 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" containerName="glance-log"
Feb 17 09:21:31 crc kubenswrapper[4848]: E0217 09:21:31.083229 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="794ee8e3-ecee-4960-9676-c11a1bf988d1" containerName="glance-httpd"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.083238 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="794ee8e3-ecee-4960-9676-c11a1bf988d1" containerName="glance-httpd"
Feb 17 09:21:31 crc kubenswrapper[4848]: E0217 09:21:31.083258 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a097e9f6-c779-42e2-8bbf-767934d60341" containerName="registry-server"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.083268 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a097e9f6-c779-42e2-8bbf-767934d60341" containerName="registry-server"
Feb 17 09:21:31 crc kubenswrapper[4848]: E0217 09:21:31.083302 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" containerName="glance-httpd"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.083311 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" containerName="glance-httpd"
Feb 17 09:21:31 crc kubenswrapper[4848]: E0217 09:21:31.083335 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a097e9f6-c779-42e2-8bbf-767934d60341" containerName="extract-utilities"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.083345 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a097e9f6-c779-42e2-8bbf-767934d60341" containerName="extract-utilities"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.084550 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="794ee8e3-ecee-4960-9676-c11a1bf988d1" containerName="glance-log"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.084586 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a097e9f6-c779-42e2-8bbf-767934d60341" containerName="registry-server"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.084614 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="794ee8e3-ecee-4960-9676-c11a1bf988d1" containerName="glance-httpd"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.084639 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" containerName="glance-httpd"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.084655 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" containerName="glance-log"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.086369 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.090111 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.090392 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.090426 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-v66vs"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.092348 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.095838 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.097677 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.103566 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.103689 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.103925 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.136455 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.159013 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.159130 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.159406 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.159467 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.159494 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.159571 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.159636 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.160104 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qnrd\" (UniqueName: \"kubernetes.io/projected/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-kube-api-access-8qnrd\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.263806 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.263870 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.263942 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.263995 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.264041 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.264078 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.264100 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qnrd\" (UniqueName: \"kubernetes.io/projected/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-kube-api-access-8qnrd\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.264260 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.264314 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.264672 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.264913 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.264969 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.264998 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.265028 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplkp\" (UniqueName: \"kubernetes.io/projected/8c8222b5-0870-4866-bbc5-2017d0c92585-kube-api-access-hplkp\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.265086 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-logs\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.265113 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.265134 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.265389 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-logs\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.265381 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.272064 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.272242 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.273840 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.287106 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.290992 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qnrd\" (UniqueName: \"kubernetes.io/projected/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-kube-api-access-8qnrd\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.302061 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " pod="openstack/glance-default-internal-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.366284 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.366707 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.366853 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.367096 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.367231 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.367313 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-logs\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.367379 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplkp\" (UniqueName: \"kubernetes.io/projected/8c8222b5-0870-4866-bbc5-2017d0c92585-kube-api-access-hplkp\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.367528 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.368388 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-844c79cc9c-crdch"]
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.367057 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.369675 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.369745 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\"
(UniqueName: \"kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-logs\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.370691 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.370822 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.375578 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.395555 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.413088 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="794ee8e3-ecee-4960-9676-c11a1bf988d1" path="/var/lib/kubelet/pods/794ee8e3-ecee-4960-9676-c11a1bf988d1/volumes" Feb 17 09:21:31 crc 
kubenswrapper[4848]: I0217 09:21:31.413785 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hplkp\" (UniqueName: \"kubernetes.io/projected/8c8222b5-0870-4866-bbc5-2017d0c92585-kube-api-access-hplkp\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.416498 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a097e9f6-c779-42e2-8bbf-767934d60341" path="/var/lib/kubelet/pods/a097e9f6-c779-42e2-8bbf-767934d60341/volumes" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.417785 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9b9c133-cba0-40cd-8d45-553e6c9d9fa4" path="/var/lib/kubelet/pods/c9b9c133-cba0-40cd-8d45-553e6c9d9fa4/volumes" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.430150 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.439297 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-749cc47784-q9crv"] Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.441918 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.448346 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.456380 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-749cc47784-q9crv"] Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.460842 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.481617 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.493743 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-775dd6779c-j9k6n"] Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.505173 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-676bdd79dd-lq228"] Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.506799 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.519444 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-676bdd79dd-lq228"] Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.535224 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.535754 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.570899 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-secret-key\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.570942 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-scripts\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.570966 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2szpc\" (UniqueName: \"kubernetes.io/projected/1068aa99-55d4-4778-ac02-b354de25d16e-kube-api-access-2szpc\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.571029 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1068aa99-55d4-4778-ac02-b354de25d16e-logs\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.571064 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-tls-certs\") pod \"horizon-749cc47784-q9crv\" (UID: 
\"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.571093 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-combined-ca-bundle\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.571127 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-config-data\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672332 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96fd6f0e-96ad-4a88-85ff-78f450b24279-horizon-secret-key\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672389 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-tls-certs\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672440 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-combined-ca-bundle\") pod \"horizon-749cc47784-q9crv\" (UID: 
\"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672477 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96fd6f0e-96ad-4a88-85ff-78f450b24279-config-data\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672520 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-config-data\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672547 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/96fd6f0e-96ad-4a88-85ff-78f450b24279-horizon-tls-certs\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672609 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-secret-key\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672634 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-scripts\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " 
pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672661 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2szpc\" (UniqueName: \"kubernetes.io/projected/1068aa99-55d4-4778-ac02-b354de25d16e-kube-api-access-2szpc\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672706 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fd6f0e-96ad-4a88-85ff-78f450b24279-logs\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672742 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96fd6f0e-96ad-4a88-85ff-78f450b24279-scripts\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672780 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nt7gk\" (UniqueName: \"kubernetes.io/projected/96fd6f0e-96ad-4a88-85ff-78f450b24279-kube-api-access-nt7gk\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.672819 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1068aa99-55d4-4778-ac02-b354de25d16e-logs\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc 
kubenswrapper[4848]: I0217 09:21:31.672839 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fd6f0e-96ad-4a88-85ff-78f450b24279-combined-ca-bundle\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.675431 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1068aa99-55d4-4778-ac02-b354de25d16e-logs\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.676221 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-scripts\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.676858 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-config-data\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.676911 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-tls-certs\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.677936 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-combined-ca-bundle\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.678754 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-secret-key\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.693129 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2szpc\" (UniqueName: \"kubernetes.io/projected/1068aa99-55d4-4778-ac02-b354de25d16e-kube-api-access-2szpc\") pod \"horizon-749cc47784-q9crv\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.775188 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fd6f0e-96ad-4a88-85ff-78f450b24279-logs\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.775272 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96fd6f0e-96ad-4a88-85ff-78f450b24279-scripts\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.775308 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nt7gk\" (UniqueName: \"kubernetes.io/projected/96fd6f0e-96ad-4a88-85ff-78f450b24279-kube-api-access-nt7gk\") pod 
\"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.775384 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fd6f0e-96ad-4a88-85ff-78f450b24279-combined-ca-bundle\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.775551 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/96fd6f0e-96ad-4a88-85ff-78f450b24279-logs\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.775839 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96fd6f0e-96ad-4a88-85ff-78f450b24279-horizon-secret-key\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.776339 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96fd6f0e-96ad-4a88-85ff-78f450b24279-config-data\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.776028 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96fd6f0e-96ad-4a88-85ff-78f450b24279-scripts\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc 
kubenswrapper[4848]: I0217 09:21:31.776393 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/96fd6f0e-96ad-4a88-85ff-78f450b24279-horizon-tls-certs\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.777507 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96fd6f0e-96ad-4a88-85ff-78f450b24279-config-data\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.779267 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96fd6f0e-96ad-4a88-85ff-78f450b24279-combined-ca-bundle\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.779787 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/96fd6f0e-96ad-4a88-85ff-78f450b24279-horizon-tls-certs\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.779944 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/96fd6f0e-96ad-4a88-85ff-78f450b24279-horizon-secret-key\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.790953 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-nt7gk\" (UniqueName: \"kubernetes.io/projected/96fd6f0e-96ad-4a88-85ff-78f450b24279-kube-api-access-nt7gk\") pod \"horizon-676bdd79dd-lq228\" (UID: \"96fd6f0e-96ad-4a88-85ff-78f450b24279\") " pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.793853 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:21:31 crc kubenswrapper[4848]: I0217 09:21:31.835964 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:21:32 crc kubenswrapper[4848]: I0217 09:21:32.687985 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:21:32 crc kubenswrapper[4848]: I0217 09:21:32.759144 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-mr8wv"] Feb 17 09:21:32 crc kubenswrapper[4848]: I0217 09:21:32.759374 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" podUID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerName="dnsmasq-dns" containerID="cri-o://9a16f27c63c8fd245ee9c7bf24dd8991f885618bde2a6b3c6564277b4466e7e1" gracePeriod=10 Feb 17 09:21:32 crc kubenswrapper[4848]: I0217 09:21:32.980703 4848 generic.go:334] "Generic (PLEG): container finished" podID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerID="9a16f27c63c8fd245ee9c7bf24dd8991f885618bde2a6b3c6564277b4466e7e1" exitCode=0 Feb 17 09:21:32 crc kubenswrapper[4848]: I0217 09:21:32.980753 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" event={"ID":"bcafc0d6-9155-456a-b63d-b5c1944fb51c","Type":"ContainerDied","Data":"9a16f27c63c8fd245ee9c7bf24dd8991f885618bde2a6b3c6564277b4466e7e1"} Feb 17 09:21:33 crc kubenswrapper[4848]: I0217 09:21:33.351080 4848 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:33 crc kubenswrapper[4848]: I0217 09:21:33.405148 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvhmn"] Feb 17 09:21:33 crc kubenswrapper[4848]: I0217 09:21:33.990129 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-nvhmn" podUID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerName="registry-server" containerID="cri-o://7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945" gracePeriod=2 Feb 17 09:21:35 crc kubenswrapper[4848]: I0217 09:21:35.008437 4848 generic.go:334] "Generic (PLEG): container finished" podID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerID="7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945" exitCode=0 Feb 17 09:21:35 crc kubenswrapper[4848]: I0217 09:21:35.008483 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvhmn" event={"ID":"0a1f257f-6ee5-49f4-8c89-033b43dc561c","Type":"ContainerDied","Data":"7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945"} Feb 17 09:21:35 crc kubenswrapper[4848]: I0217 09:21:35.852549 4848 scope.go:117] "RemoveContainer" containerID="30c524ee874fd62fee31ffee4aeca3c764124a36cbdedc19eab19f7b76a4a219" Feb 17 09:21:35 crc kubenswrapper[4848]: I0217 09:21:35.949278 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" podUID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: connect: connection refused" Feb 17 09:21:35 crc kubenswrapper[4848]: I0217 09:21:35.963037 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.018479 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rb86p" event={"ID":"e920b3d7-6812-4953-8baf-e0039e7aa8a7","Type":"ContainerDied","Data":"b2c312ef0b29f439de77943f585f9d85ac68c34e72f9f9b0f0c7e4b6a1dd7a84"} Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.018507 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rb86p" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.018519 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2c312ef0b29f439de77943f585f9d85ac68c34e72f9f9b0f0c7e4b6a1dd7a84" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.100542 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-combined-ca-bundle\") pod \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.100620 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-config-data\") pod \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.100823 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-fernet-keys\") pod \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.100899 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" 
(UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-credential-keys\") pod \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.100942 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-scripts\") pod \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.100993 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lznbm\" (UniqueName: \"kubernetes.io/projected/e920b3d7-6812-4953-8baf-e0039e7aa8a7-kube-api-access-lznbm\") pod \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\" (UID: \"e920b3d7-6812-4953-8baf-e0039e7aa8a7\") " Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.106500 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "e920b3d7-6812-4953-8baf-e0039e7aa8a7" (UID: "e920b3d7-6812-4953-8baf-e0039e7aa8a7"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.106976 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e920b3d7-6812-4953-8baf-e0039e7aa8a7-kube-api-access-lznbm" (OuterVolumeSpecName: "kube-api-access-lznbm") pod "e920b3d7-6812-4953-8baf-e0039e7aa8a7" (UID: "e920b3d7-6812-4953-8baf-e0039e7aa8a7"). InnerVolumeSpecName "kube-api-access-lznbm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.107118 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "e920b3d7-6812-4953-8baf-e0039e7aa8a7" (UID: "e920b3d7-6812-4953-8baf-e0039e7aa8a7"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.109475 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-scripts" (OuterVolumeSpecName: "scripts") pod "e920b3d7-6812-4953-8baf-e0039e7aa8a7" (UID: "e920b3d7-6812-4953-8baf-e0039e7aa8a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.124933 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e920b3d7-6812-4953-8baf-e0039e7aa8a7" (UID: "e920b3d7-6812-4953-8baf-e0039e7aa8a7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.141209 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-config-data" (OuterVolumeSpecName: "config-data") pod "e920b3d7-6812-4953-8baf-e0039e7aa8a7" (UID: "e920b3d7-6812-4953-8baf-e0039e7aa8a7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.202801 4848 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.202832 4848 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.202843 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.202852 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lznbm\" (UniqueName: \"kubernetes.io/projected/e920b3d7-6812-4953-8baf-e0039e7aa8a7-kube-api-access-lznbm\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.202861 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:36 crc kubenswrapper[4848]: I0217 09:21:36.202869 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e920b3d7-6812-4953-8baf-e0039e7aa8a7-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:36 crc kubenswrapper[4848]: E0217 09:21:36.603197 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5ac8ede62671a3b3695cf29bd3a6f124f27c93d1730f9030cc3daa05034d4af4" Feb 17 09:21:36 crc 
kubenswrapper[4848]: E0217 09:21:36.603817 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central@sha256:5ac8ede62671a3b3695cf29bd3a6f124f27c93d1730f9030cc3daa05034d4af4,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n548hb7h548hfbh55h8bh5f8h597h8dh65fh55fh649h5d6h577hb8h687h65bh5f6hcfh6dh64dh668h697h5d7h644h665h7h657h66h59dh696hbq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-45bz8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(19facc80-e9df-42dc-8124-7619b2167b5c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.137192 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rb86p"] Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.143641 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rb86p"] Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.258272 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-j6p6m"] Feb 17 09:21:37 crc kubenswrapper[4848]: E0217 09:21:37.258944 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e920b3d7-6812-4953-8baf-e0039e7aa8a7" containerName="keystone-bootstrap" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.258968 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e920b3d7-6812-4953-8baf-e0039e7aa8a7" containerName="keystone-bootstrap" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.259339 4848 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e920b3d7-6812-4953-8baf-e0039e7aa8a7" containerName="keystone-bootstrap" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.260049 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.262998 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.263673 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.264043 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p6mz5" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.264384 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.264979 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j6p6m"] Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.266497 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.338233 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-config-data\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.338277 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-combined-ca-bundle\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " 
pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.338347 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-credential-keys\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.338387 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdrf9\" (UniqueName: \"kubernetes.io/projected/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-kube-api-access-gdrf9\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.338559 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-fernet-keys\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.338719 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-scripts\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.392879 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e920b3d7-6812-4953-8baf-e0039e7aa8a7" path="/var/lib/kubelet/pods/e920b3d7-6812-4953-8baf-e0039e7aa8a7/volumes" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.442174 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-scripts\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.442234 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-config-data\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.442304 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-combined-ca-bundle\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.442481 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-credential-keys\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.445449 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdrf9\" (UniqueName: \"kubernetes.io/projected/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-kube-api-access-gdrf9\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.446467 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-fernet-keys\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.448018 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-credential-keys\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.448249 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-config-data\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.450619 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-combined-ca-bundle\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.450935 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-scripts\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.451378 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-fernet-keys\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " 
pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.468362 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdrf9\" (UniqueName: \"kubernetes.io/projected/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-kube-api-access-gdrf9\") pod \"keystone-bootstrap-j6p6m\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:37 crc kubenswrapper[4848]: I0217 09:21:37.596392 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:21:41 crc kubenswrapper[4848]: E0217 09:21:41.221170 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-placement-api@sha256:657020ed78b5d92505b0b4187dfcf078515484304fd39ce38702d4fb06f4ca36" Feb 17 09:21:41 crc kubenswrapper[4848]: E0217 09:21:41.221797 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:placement-db-sync,Image:quay.io/podified-antelope-centos9/openstack-placement-api@sha256:657020ed78b5d92505b0b4187dfcf078515484304fd39ce38702d4fb06f4ca36,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/placement,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:placement-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-njvmh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42482,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
placement-db-sync-ptthx_openstack(9e0d87d6-fd65-4565-971f-070050f2f9ff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 09:21:41 crc kubenswrapper[4848]: E0217 09:21:41.223031 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/placement-db-sync-ptthx" podUID="9e0d87d6-fd65-4565-971f-070050f2f9ff" Feb 17 09:21:42 crc kubenswrapper[4848]: I0217 09:21:42.075856 4848 generic.go:334] "Generic (PLEG): container finished" podID="6eaa8789-cc44-4571-a25b-b7a7f56668f8" containerID="6074f51855bf7d05823997c5ea7977da776d02f7b85e02c3d39e1c949344f06b" exitCode=0 Feb 17 09:21:42 crc kubenswrapper[4848]: I0217 09:21:42.075928 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zh2hd" event={"ID":"6eaa8789-cc44-4571-a25b-b7a7f56668f8","Type":"ContainerDied","Data":"6074f51855bf7d05823997c5ea7977da776d02f7b85e02c3d39e1c949344f06b"} Feb 17 09:21:42 crc kubenswrapper[4848]: E0217 09:21:42.078687 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"placement-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-placement-api@sha256:657020ed78b5d92505b0b4187dfcf078515484304fd39ce38702d4fb06f4ca36\\\"\"" pod="openstack/placement-db-sync-ptthx" podUID="9e0d87d6-fd65-4565-971f-070050f2f9ff" Feb 17 09:21:43 crc kubenswrapper[4848]: E0217 09:21:43.284678 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945 is running failed: container process not found" containerID="7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945" cmd=["grpc_health_probe","-addr=:50051"] 
Feb 17 09:21:43 crc kubenswrapper[4848]: E0217 09:21:43.285431 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945 is running failed: container process not found" containerID="7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 09:21:43 crc kubenswrapper[4848]: E0217 09:21:43.285628 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945 is running failed: container process not found" containerID="7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945" cmd=["grpc_health_probe","-addr=:50051"] Feb 17 09:21:43 crc kubenswrapper[4848]: E0217 09:21:43.285669 4848 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-nvhmn" podUID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerName="registry-server" Feb 17 09:21:45 crc kubenswrapper[4848]: I0217 09:21:45.954804 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" podUID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Feb 17 09:21:48 crc kubenswrapper[4848]: I0217 09:21:48.771692 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Feb 17 09:21:48 crc kubenswrapper[4848]: I0217 09:21:48.772160 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:21:49 crc kubenswrapper[4848]: E0217 09:21:49.575453 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec" Feb 17 09:21:49 crc kubenswrapper[4848]: E0217 09:21:49.575785 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d24gr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-jq6l8_openstack(dcb8e782-fe60-4c41-a843-1980fc8ab3cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 09:21:49 crc kubenswrapper[4848]: E0217 09:21:49.576989 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-jq6l8" 
podUID="dcb8e782-fe60-4c41-a843-1980fc8ab3cc" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.677065 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.687616 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.708117 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.796586 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-config\") pod \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.796637 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6j4s\" (UniqueName: \"kubernetes.io/projected/0a1f257f-6ee5-49f4-8c89-033b43dc561c-kube-api-access-m6j4s\") pod \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.796678 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-sb\") pod \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.796726 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm7c4\" (UniqueName: \"kubernetes.io/projected/bcafc0d6-9155-456a-b63d-b5c1944fb51c-kube-api-access-wm7c4\") pod \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\" (UID: 
\"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.796805 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-svc\") pod \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.796836 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-swift-storage-0\") pod \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.796853 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-nb\") pod \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\" (UID: \"bcafc0d6-9155-456a-b63d-b5c1944fb51c\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.796881 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-catalog-content\") pod \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.796972 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-utilities\") pod \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\" (UID: \"0a1f257f-6ee5-49f4-8c89-033b43dc561c\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.798495 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-utilities" (OuterVolumeSpecName: "utilities") pod "0a1f257f-6ee5-49f4-8c89-033b43dc561c" (UID: "0a1f257f-6ee5-49f4-8c89-033b43dc561c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.803886 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a1f257f-6ee5-49f4-8c89-033b43dc561c-kube-api-access-m6j4s" (OuterVolumeSpecName: "kube-api-access-m6j4s") pod "0a1f257f-6ee5-49f4-8c89-033b43dc561c" (UID: "0a1f257f-6ee5-49f4-8c89-033b43dc561c"). InnerVolumeSpecName "kube-api-access-m6j4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.808120 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcafc0d6-9155-456a-b63d-b5c1944fb51c-kube-api-access-wm7c4" (OuterVolumeSpecName: "kube-api-access-wm7c4") pod "bcafc0d6-9155-456a-b63d-b5c1944fb51c" (UID: "bcafc0d6-9155-456a-b63d-b5c1944fb51c"). InnerVolumeSpecName "kube-api-access-wm7c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.826571 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0a1f257f-6ee5-49f4-8c89-033b43dc561c" (UID: "0a1f257f-6ee5-49f4-8c89-033b43dc561c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.846099 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "bcafc0d6-9155-456a-b63d-b5c1944fb51c" (UID: "bcafc0d6-9155-456a-b63d-b5c1944fb51c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.850310 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bcafc0d6-9155-456a-b63d-b5c1944fb51c" (UID: "bcafc0d6-9155-456a-b63d-b5c1944fb51c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.854393 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bcafc0d6-9155-456a-b63d-b5c1944fb51c" (UID: "bcafc0d6-9155-456a-b63d-b5c1944fb51c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.860674 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-config" (OuterVolumeSpecName: "config") pod "bcafc0d6-9155-456a-b63d-b5c1944fb51c" (UID: "bcafc0d6-9155-456a-b63d-b5c1944fb51c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.866251 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bcafc0d6-9155-456a-b63d-b5c1944fb51c" (UID: "bcafc0d6-9155-456a-b63d-b5c1944fb51c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.899631 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-combined-ca-bundle\") pod \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.899734 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-config\") pod \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.899805 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nw2dq\" (UniqueName: \"kubernetes.io/projected/6eaa8789-cc44-4571-a25b-b7a7f56668f8-kube-api-access-nw2dq\") pod \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\" (UID: \"6eaa8789-cc44-4571-a25b-b7a7f56668f8\") " Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.900356 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.900374 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.900385 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6j4s\" (UniqueName: \"kubernetes.io/projected/0a1f257f-6ee5-49f4-8c89-033b43dc561c-kube-api-access-m6j4s\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.900396 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.900405 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm7c4\" (UniqueName: \"kubernetes.io/projected/bcafc0d6-9155-456a-b63d-b5c1944fb51c-kube-api-access-wm7c4\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.900415 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.900423 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.900432 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bcafc0d6-9155-456a-b63d-b5c1944fb51c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.900441 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0a1f257f-6ee5-49f4-8c89-033b43dc561c-catalog-content\") on node \"crc\" 
DevicePath \"\"" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.903260 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eaa8789-cc44-4571-a25b-b7a7f56668f8-kube-api-access-nw2dq" (OuterVolumeSpecName: "kube-api-access-nw2dq") pod "6eaa8789-cc44-4571-a25b-b7a7f56668f8" (UID: "6eaa8789-cc44-4571-a25b-b7a7f56668f8"). InnerVolumeSpecName "kube-api-access-nw2dq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.920057 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6eaa8789-cc44-4571-a25b-b7a7f56668f8" (UID: "6eaa8789-cc44-4571-a25b-b7a7f56668f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:49 crc kubenswrapper[4848]: I0217 09:21:49.920432 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-config" (OuterVolumeSpecName: "config") pod "6eaa8789-cc44-4571-a25b-b7a7f56668f8" (UID: "6eaa8789-cc44-4571-a25b-b7a7f56668f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.001439 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.001467 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6eaa8789-cc44-4571-a25b-b7a7f56668f8-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.001479 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nw2dq\" (UniqueName: \"kubernetes.io/projected/6eaa8789-cc44-4571-a25b-b7a7f56668f8-kube-api-access-nw2dq\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.151908 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" event={"ID":"bcafc0d6-9155-456a-b63d-b5c1944fb51c","Type":"ContainerDied","Data":"2cb349136673532d0937ad1f21364bf9e2e571f0d03e6177b550ee568248d9b2"} Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.151973 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.158048 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-nvhmn" event={"ID":"0a1f257f-6ee5-49f4-8c89-033b43dc561c","Type":"ContainerDied","Data":"719497241afed82086063d0bf0d3226e147b082c75f4d7783e47db4ba4a08db9"} Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.158079 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-nvhmn" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.161792 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-zh2hd" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.162944 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-zh2hd" event={"ID":"6eaa8789-cc44-4571-a25b-b7a7f56668f8","Type":"ContainerDied","Data":"6b4fda0c707564e395863d613f42aec4e31179938891c84dd3104b25dac40c89"} Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.162972 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b4fda0c707564e395863d613f42aec4e31179938891c84dd3104b25dac40c89" Feb 17 09:21:50 crc kubenswrapper[4848]: E0217 09:21:50.163048 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api@sha256:a5f8855b2ed00a661ac827cc3908e540ed2327354ac5a1d39491f4507237b4ec\\\"\"" pod="openstack/barbican-db-sync-jq6l8" podUID="dcb8e782-fe60-4c41-a843-1980fc8ab3cc" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.206026 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-mr8wv"] Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.216823 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-96fb4d4c9-mr8wv"] Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.226198 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvhmn"] Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.234050 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-nvhmn"] Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.901897 4848 scope.go:117] "RemoveContainer" containerID="e4a508a5968beef0972cea08167f706605052013157cad62b0096c07c6c6224f" Feb 17 09:21:50 crc kubenswrapper[4848]: E0217 09:21:50.955343 4848 log.go:32] "PullImage from image service 
failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b" Feb 17 09:21:50 crc kubenswrapper[4848]: E0217 09:21:50.955565 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOn
ly:nil,},VolumeMount{Name:kube-api-access-rr7h8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-lh5dl_openstack(ba8c67b5-216a-4f60-baee-c1b6211f89ec): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 17 09:21:50 crc kubenswrapper[4848]: E0217 09:21:50.956995 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-lh5dl" podUID="ba8c67b5-216a-4f60-baee-c1b6211f89ec" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.960997 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-96fb4d4c9-mr8wv" podUID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.125:5353: i/o timeout" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.989053 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-2s8ss"] Feb 17 09:21:50 crc kubenswrapper[4848]: E0217 09:21:50.989394 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerName="extract-content" Feb 17 
09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.989406 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerName="extract-content" Feb 17 09:21:50 crc kubenswrapper[4848]: E0217 09:21:50.989427 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerName="init" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.989432 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerName="init" Feb 17 09:21:50 crc kubenswrapper[4848]: E0217 09:21:50.989449 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerName="registry-server" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.989455 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerName="registry-server" Feb 17 09:21:50 crc kubenswrapper[4848]: E0217 09:21:50.989465 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerName="dnsmasq-dns" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.989470 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerName="dnsmasq-dns" Feb 17 09:21:50 crc kubenswrapper[4848]: E0217 09:21:50.989486 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eaa8789-cc44-4571-a25b-b7a7f56668f8" containerName="neutron-db-sync" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.989491 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eaa8789-cc44-4571-a25b-b7a7f56668f8" containerName="neutron-db-sync" Feb 17 09:21:50 crc kubenswrapper[4848]: E0217 09:21:50.989502 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerName="extract-utilities" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 
09:21:50.989507 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerName="extract-utilities" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.989674 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" containerName="dnsmasq-dns" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.989691 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eaa8789-cc44-4571-a25b-b7a7f56668f8" containerName="neutron-db-sync" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.989700 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" containerName="registry-server" Feb 17 09:21:50 crc kubenswrapper[4848]: I0217 09:21:50.996891 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.012373 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-2s8ss"] Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.120026 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6cb4bc9fb8-r52gj"] Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.122459 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.123466 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-swift-storage-0\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.123585 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvmlc\" (UniqueName: \"kubernetes.io/projected/72029462-f77a-48a6-8fcc-49d2e9ab7046-kube-api-access-gvmlc\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.123627 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-config\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.123653 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-nb\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.123669 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-svc\") pod 
\"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.123707 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-sb\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.125031 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.125147 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-nkq89" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.125187 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.125278 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.128996 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6cb4bc9fb8-r52gj"] Feb 17 09:21:51 crc kubenswrapper[4848]: E0217 09:21:51.180815 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api@sha256:3fa6e687aa002b92fedbfe2c1ccaa2906b399c58d17bf9ecece2c4cd69a0210b\\\"\"" pod="openstack/cinder-db-sync-lh5dl" podUID="ba8c67b5-216a-4f60-baee-c1b6211f89ec" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.225922 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-swift-storage-0\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.225974 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg6dv\" (UniqueName: \"kubernetes.io/projected/60ec54d0-c985-48d2-b081-c1082d132e65-kube-api-access-cg6dv\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.226001 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-combined-ca-bundle\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.226134 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-ovndb-tls-certs\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.226177 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-httpd-config\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.226198 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-config\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.226635 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvmlc\" (UniqueName: \"kubernetes.io/projected/72029462-f77a-48a6-8fcc-49d2e9ab7046-kube-api-access-gvmlc\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.226659 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-swift-storage-0\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.226707 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-config\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.226746 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-nb\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.226785 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-svc\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.226845 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-sb\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.227489 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-sb\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.228035 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-svc\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.228286 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-config\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.229821 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-nb\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: 
\"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.246887 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvmlc\" (UniqueName: \"kubernetes.io/projected/72029462-f77a-48a6-8fcc-49d2e9ab7046-kube-api-access-gvmlc\") pod \"dnsmasq-dns-6f455b5fc7-2s8ss\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") " pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.327910 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg6dv\" (UniqueName: \"kubernetes.io/projected/60ec54d0-c985-48d2-b081-c1082d132e65-kube-api-access-cg6dv\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.327954 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-combined-ca-bundle\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.327988 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-ovndb-tls-certs\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.328009 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-httpd-config\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " 
pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.328024 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-config\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.328682 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.332872 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-combined-ca-bundle\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.333953 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-httpd-config\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.335710 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-ovndb-tls-certs\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.336576 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-config\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: 
\"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.359635 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg6dv\" (UniqueName: \"kubernetes.io/projected/60ec54d0-c985-48d2-b081-c1082d132e65-kube-api-access-cg6dv\") pod \"neutron-6cb4bc9fb8-r52gj\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.393153 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a1f257f-6ee5-49f4-8c89-033b43dc561c" path="/var/lib/kubelet/pods/0a1f257f-6ee5-49f4-8c89-033b43dc561c/volumes" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.394133 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcafc0d6-9155-456a-b63d-b5c1944fb51c" path="/var/lib/kubelet/pods/bcafc0d6-9155-456a-b63d-b5c1944fb51c/volumes" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.437333 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.527339 4848 scope.go:117] "RemoveContainer" containerID="9a16f27c63c8fd245ee9c7bf24dd8991f885618bde2a6b3c6564277b4466e7e1" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.592897 4848 scope.go:117] "RemoveContainer" containerID="915958aff7309f7c86128c894bb730fb7b32670baf1c2043ad27102c1fd5c13d" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.661356 4848 scope.go:117] "RemoveContainer" containerID="7d00dd8337e77677691248151145d973a9aea39f30386d9a854e786330631945" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.721213 4848 scope.go:117] "RemoveContainer" containerID="9f5df400784414601b0dcd5d77928319c121988940c983ab6db1f0026940739f" Feb 17 09:21:51 crc kubenswrapper[4848]: I0217 09:21:51.756411 4848 scope.go:117] "RemoveContainer" containerID="eb3c14030316fe503800a30601339d401b67e1276db8018c1910716c9e81b1da" Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.221505 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-749cc47784-q9crv"] Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.233254 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-775dd6779c-j9k6n" event={"ID":"f8226a25-977f-4934-b40b-7504ab8f23e4","Type":"ContainerStarted","Data":"276455a72da624622d24fe647d83eb57d4649914bdffc56aef69370957d54dad"} Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.233401 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-775dd6779c-j9k6n" event={"ID":"f8226a25-977f-4934-b40b-7504ab8f23e4","Type":"ContainerStarted","Data":"5f3c2dcbbc6ea1b986ce34c527d927661d7665ba59bc45f1b819435a7303a7c4"} Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.233598 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-775dd6779c-j9k6n" podUID="f8226a25-977f-4934-b40b-7504ab8f23e4" containerName="horizon-log" 
containerID="cri-o://5f3c2dcbbc6ea1b986ce34c527d927661d7665ba59bc45f1b819435a7303a7c4" gracePeriod=30 Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.234115 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-775dd6779c-j9k6n" podUID="f8226a25-977f-4934-b40b-7504ab8f23e4" containerName="horizon" containerID="cri-o://276455a72da624622d24fe647d83eb57d4649914bdffc56aef69370957d54dad" gracePeriod=30 Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.258415 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bbdfc68f-j9b8f" event={"ID":"d790c2ab-67aa-4c46-9407-9fa991223dd0","Type":"ContainerStarted","Data":"6993accf01497b8287cf6562a92fd53c6e9d1c384ab683be59ec133035931f6f"} Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.258695 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bbdfc68f-j9b8f" event={"ID":"d790c2ab-67aa-4c46-9407-9fa991223dd0","Type":"ContainerStarted","Data":"0af8c9459567215b1bb5f1aca4bdbe716fa1f64aedf6b3a4c8c6ce8750551728"} Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.258826 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57bbdfc68f-j9b8f" podUID="d790c2ab-67aa-4c46-9407-9fa991223dd0" containerName="horizon-log" containerID="cri-o://0af8c9459567215b1bb5f1aca4bdbe716fa1f64aedf6b3a4c8c6ce8750551728" gracePeriod=30 Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.258906 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-57bbdfc68f-j9b8f" podUID="d790c2ab-67aa-4c46-9407-9fa991223dd0" containerName="horizon" containerID="cri-o://6993accf01497b8287cf6562a92fd53c6e9d1c384ab683be59ec133035931f6f" gracePeriod=30 Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.259523 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-676bdd79dd-lq228"] Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 
09:21:52.286128 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-844c79cc9c-crdch" podUID="adcdc585-f670-4bc3-be54-acf796d438df" containerName="horizon-log" containerID="cri-o://129fe4ebbb776d354652314ab286f6852b2eda985baaa608ebb67534453727d9" gracePeriod=30 Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.286350 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c79cc9c-crdch" event={"ID":"adcdc585-f670-4bc3-be54-acf796d438df","Type":"ContainerStarted","Data":"558fcd89e85a1796d184735aafca39c6f5a56a859a594d7ec3af0dd0a0436a8b"} Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.286375 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c79cc9c-crdch" event={"ID":"adcdc585-f670-4bc3-be54-acf796d438df","Type":"ContainerStarted","Data":"129fe4ebbb776d354652314ab286f6852b2eda985baaa608ebb67534453727d9"} Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.286420 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-844c79cc9c-crdch" podUID="adcdc585-f670-4bc3-be54-acf796d438df" containerName="horizon" containerID="cri-o://558fcd89e85a1796d184735aafca39c6f5a56a859a594d7ec3af0dd0a0436a8b" gracePeriod=30 Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.302522 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-775dd6779c-j9k6n" podStartSLOduration=2.454686196 podStartE2EDuration="28.302505435s" podCreationTimestamp="2026-02-17 09:21:24 +0000 UTC" firstStartedPulling="2026-02-17 09:21:25.622831306 +0000 UTC m=+963.166086952" lastFinishedPulling="2026-02-17 09:21:51.470650545 +0000 UTC m=+989.013906191" observedRunningTime="2026-02-17 09:21:52.293384631 +0000 UTC m=+989.836640287" watchObservedRunningTime="2026-02-17 09:21:52.302505435 +0000 UTC m=+989.845761081" Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.311983 4848 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-2s8ss"] Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.312530 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19facc80-e9df-42dc-8124-7619b2167b5c","Type":"ContainerStarted","Data":"d694ea51f368836cbddab6d27bab1acb6fcbace326a0b7a7bd157af6be65915f"} Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.344824 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j6p6m"] Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.347096 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-844c79cc9c-crdch" podStartSLOduration=3.014962038 podStartE2EDuration="31.34707747s" podCreationTimestamp="2026-02-17 09:21:21 +0000 UTC" firstStartedPulling="2026-02-17 09:21:23.14083783 +0000 UTC m=+960.684093476" lastFinishedPulling="2026-02-17 09:21:51.472953262 +0000 UTC m=+989.016208908" observedRunningTime="2026-02-17 09:21:52.322936099 +0000 UTC m=+989.866191735" watchObservedRunningTime="2026-02-17 09:21:52.34707747 +0000 UTC m=+989.890333116" Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.380836 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-57bbdfc68f-j9b8f" podStartSLOduration=4.071763425 podStartE2EDuration="30.38081862s" podCreationTimestamp="2026-02-17 09:21:22 +0000 UTC" firstStartedPulling="2026-02-17 09:21:23.289306802 +0000 UTC m=+960.832562448" lastFinishedPulling="2026-02-17 09:21:49.598361987 +0000 UTC m=+987.141617643" observedRunningTime="2026-02-17 09:21:52.367202525 +0000 UTC m=+989.910458171" watchObservedRunningTime="2026-02-17 09:21:52.38081862 +0000 UTC m=+989.924074256" Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.398353 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:52 crc kubenswrapper[4848]: W0217 09:21:52.423166 4848 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9a50ad8_91ed_455d_9810_6f0dc1fb03f0.slice/crio-d85988bde8daa1e446cab6f039951b7497a2918bf692d890663ac5d3155ad02e WatchSource:0}: Error finding container d85988bde8daa1e446cab6f039951b7497a2918bf692d890663ac5d3155ad02e: Status 404 returned error can't find the container with id d85988bde8daa1e446cab6f039951b7497a2918bf692d890663ac5d3155ad02e Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.519154 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:21:52 crc kubenswrapper[4848]: I0217 09:21:52.575987 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.071624 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5df76f45d5-hgxv9"] Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.073360 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.074877 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.075133 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.120206 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5df76f45d5-hgxv9"] Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.172825 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6cb4bc9fb8-r52gj"] Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.177625 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-combined-ca-bundle\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.177668 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-config\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.177730 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-ovndb-tls-certs\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.177777 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-internal-tls-certs\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.177797 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-795rc\" (UniqueName: \"kubernetes.io/projected/e37f2d41-bace-4ca3-a811-36b44ee278d4-kube-api-access-795rc\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.177814 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-httpd-config\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.177852 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-public-tls-certs\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.279131 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-ovndb-tls-certs\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.279194 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-internal-tls-certs\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.279218 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-795rc\" (UniqueName: \"kubernetes.io/projected/e37f2d41-bace-4ca3-a811-36b44ee278d4-kube-api-access-795rc\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.279240 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-httpd-config\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.279258 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-public-tls-certs\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.279321 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-combined-ca-bundle\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.279344 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-config\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.285107 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-config\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.286393 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-combined-ca-bundle\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.286937 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-public-tls-certs\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.287383 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-ovndb-tls-certs\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.287412 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-internal-tls-certs\") pod \"neutron-5df76f45d5-hgxv9\" (UID: 
\"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.288961 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-httpd-config\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.302968 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-795rc\" (UniqueName: \"kubernetes.io/projected/e37f2d41-bace-4ca3-a811-36b44ee278d4-kube-api-access-795rc\") pod \"neutron-5df76f45d5-hgxv9\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") " pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.339299 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cb4bc9fb8-r52gj" event={"ID":"60ec54d0-c985-48d2-b081-c1082d132e65","Type":"ContainerStarted","Data":"fb743eed00414d9aec17782d468240a35660f5722df178e955a50ce240ef2cc8"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.341647 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-749cc47784-q9crv" event={"ID":"1068aa99-55d4-4778-ac02-b354de25d16e","Type":"ContainerStarted","Data":"0cafaca3851618f0ffdb14a0cf40a0bb789ede4dd0cfff526ba29a414b7a3d5c"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.341698 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-749cc47784-q9crv" event={"ID":"1068aa99-55d4-4778-ac02-b354de25d16e","Type":"ContainerStarted","Data":"479b497cdf95e27b9cf69855fe7ee2bce4facd9452020fe51dd99e0da1ac4e2a"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.341710 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-749cc47784-q9crv" 
event={"ID":"1068aa99-55d4-4778-ac02-b354de25d16e","Type":"ContainerStarted","Data":"8eac9682f7ccd155e82d400a93cd4fd694731032d22be3d11804e259800056cb"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.355209 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0","Type":"ContainerStarted","Data":"df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.355247 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0","Type":"ContainerStarted","Data":"d85988bde8daa1e446cab6f039951b7497a2918bf692d890663ac5d3155ad02e"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.379004 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-749cc47784-q9crv" podStartSLOduration=22.378987311 podStartE2EDuration="22.378987311s" podCreationTimestamp="2026-02-17 09:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:53.363925713 +0000 UTC m=+990.907181359" watchObservedRunningTime="2026-02-17 09:21:53.378987311 +0000 UTC m=+990.922242957" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.386501 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-676bdd79dd-lq228" event={"ID":"96fd6f0e-96ad-4a88-85ff-78f450b24279","Type":"ContainerStarted","Data":"444738db4ecdb64c9972fb683d299988b65572dd1deaeea4c9b6bf878b52aad4"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.386555 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-676bdd79dd-lq228" event={"ID":"96fd6f0e-96ad-4a88-85ff-78f450b24279","Type":"ContainerStarted","Data":"cae8aa67c61a2478537ae3ad6516e549eddb5d49e6711fd1187b019432b5964b"} Feb 17 09:21:53 crc 
kubenswrapper[4848]: I0217 09:21:53.386568 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-676bdd79dd-lq228" event={"ID":"96fd6f0e-96ad-4a88-85ff-78f450b24279","Type":"ContainerStarted","Data":"6931205b154084a719f3161641f7df95f7cdc9363a0af136684c0b12bc0032dd"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.400984 4848 generic.go:334] "Generic (PLEG): container finished" podID="72029462-f77a-48a6-8fcc-49d2e9ab7046" containerID="a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b" exitCode=0 Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.408653 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" event={"ID":"72029462-f77a-48a6-8fcc-49d2e9ab7046","Type":"ContainerDied","Data":"a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.408694 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" event={"ID":"72029462-f77a-48a6-8fcc-49d2e9ab7046","Type":"ContainerStarted","Data":"c2215ee1bd9cfd1830fabe03d586071621d75d7b1e56f81bacdcab8aa694a3aa"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.421349 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.429990 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j6p6m" event={"ID":"0dd600b5-f56b-460a-acb0-a4dc5fd2de23","Type":"ContainerStarted","Data":"88d0e2594cadd88bd48db0488f569383e741546126bbcdfc360e84ebf2c60057"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.430069 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j6p6m" event={"ID":"0dd600b5-f56b-460a-acb0-a4dc5fd2de23","Type":"ContainerStarted","Data":"909dcb6a6020b6abc1c40d93046ade2ae815508ecdd0313365ebf0fdba8020b9"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.448024 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c8222b5-0870-4866-bbc5-2017d0c92585","Type":"ContainerStarted","Data":"f47faa6c5da2226694633f1385a602ec112fece4dfd654ac67e6d71d875e949d"} Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.614971 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-676bdd79dd-lq228" podStartSLOduration=22.614953903 podStartE2EDuration="22.614953903s" podCreationTimestamp="2026-02-17 09:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:53.595122087 +0000 UTC m=+991.138377733" watchObservedRunningTime="2026-02-17 09:21:53.614953903 +0000 UTC m=+991.158209549" Feb 17 09:21:53 crc kubenswrapper[4848]: I0217 09:21:53.652246 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-j6p6m" podStartSLOduration=16.652229476 podStartE2EDuration="16.652229476s" podCreationTimestamp="2026-02-17 09:21:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-17 09:21:53.634380868 +0000 UTC m=+991.177636534" watchObservedRunningTime="2026-02-17 09:21:53.652229476 +0000 UTC m=+991.195485122" Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.152839 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5df76f45d5-hgxv9"] Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.468326 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df76f45d5-hgxv9" event={"ID":"e37f2d41-bace-4ca3-a811-36b44ee278d4","Type":"ContainerStarted","Data":"e0cfcb15b1d4aaa58b2591c7e99cd1ba80de439c93dc40cba571166b780b4d51"} Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.470870 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" event={"ID":"72029462-f77a-48a6-8fcc-49d2e9ab7046","Type":"ContainerStarted","Data":"ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba"} Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.471375 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.474486 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c8222b5-0870-4866-bbc5-2017d0c92585","Type":"ContainerStarted","Data":"80020213f66ca77df170458f285fbdbaefff874e8b7e7ed1a0567df823777379"} Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.481064 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cb4bc9fb8-r52gj" event={"ID":"60ec54d0-c985-48d2-b081-c1082d132e65","Type":"ContainerStarted","Data":"f6ea2a2dd8077ea22a1495dc5fd86e223ff5877b1de922b517b656dd0480dc14"} Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.481101 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.481112 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cb4bc9fb8-r52gj" event={"ID":"60ec54d0-c985-48d2-b081-c1082d132e65","Type":"ContainerStarted","Data":"3df3f5833c5cdb4a1c7b5b2236a4f4786eca712945f32218dd228bd723839960"} Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.490586 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" podStartSLOduration=4.490571325 podStartE2EDuration="4.490571325s" podCreationTimestamp="2026-02-17 09:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:54.4859243 +0000 UTC m=+992.029179946" watchObservedRunningTime="2026-02-17 09:21:54.490571325 +0000 UTC m=+992.033826961" Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.505449 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6cb4bc9fb8-r52gj" podStartSLOduration=3.505430187 podStartE2EDuration="3.505430187s" podCreationTimestamp="2026-02-17 09:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:54.505172359 +0000 UTC m=+992.048428005" watchObservedRunningTime="2026-02-17 09:21:54.505430187 +0000 UTC m=+992.048685833" Feb 17 09:21:54 crc kubenswrapper[4848]: I0217 09:21:54.899479 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.492696 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df76f45d5-hgxv9" event={"ID":"e37f2d41-bace-4ca3-a811-36b44ee278d4","Type":"ContainerStarted","Data":"32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649"} Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.493433 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/neutron-5df76f45d5-hgxv9" Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.493887 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df76f45d5-hgxv9" event={"ID":"e37f2d41-bace-4ca3-a811-36b44ee278d4","Type":"ContainerStarted","Data":"7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b"} Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.500458 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c8222b5-0870-4866-bbc5-2017d0c92585","Type":"ContainerStarted","Data":"1d632eddee1987ee91aa3f4ae92d1677da235cfcb722e6d98aa4df16b43d3e9a"} Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.500778 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8c8222b5-0870-4866-bbc5-2017d0c92585" containerName="glance-log" containerID="cri-o://80020213f66ca77df170458f285fbdbaefff874e8b7e7ed1a0567df823777379" gracePeriod=30 Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.500922 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8c8222b5-0870-4866-bbc5-2017d0c92585" containerName="glance-httpd" containerID="cri-o://1d632eddee1987ee91aa3f4ae92d1677da235cfcb722e6d98aa4df16b43d3e9a" gracePeriod=30 Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.510347 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" containerName="glance-log" containerID="cri-o://df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd" gracePeriod=30 Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.510656 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0","Type":"ContainerStarted","Data":"ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4"} Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.511322 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" containerName="glance-httpd" containerID="cri-o://ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4" gracePeriod=30 Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.526782 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5df76f45d5-hgxv9" podStartSLOduration=2.526737289 podStartE2EDuration="2.526737289s" podCreationTimestamp="2026-02-17 09:21:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:55.515450852 +0000 UTC m=+993.058706508" watchObservedRunningTime="2026-02-17 09:21:55.526737289 +0000 UTC m=+993.069992935" Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.551718 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=25.551692004 podStartE2EDuration="25.551692004s" podCreationTimestamp="2026-02-17 09:21:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:55.539065717 +0000 UTC m=+993.082321383" watchObservedRunningTime="2026-02-17 09:21:55.551692004 +0000 UTC m=+993.094947670" Feb 17 09:21:55 crc kubenswrapper[4848]: I0217 09:21:55.575129 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=24.575108594 podStartE2EDuration="24.575108594s" podCreationTimestamp="2026-02-17 09:21:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:21:55.564309131 +0000 UTC m=+993.107564797" watchObservedRunningTime="2026-02-17 09:21:55.575108594 +0000 UTC m=+993.118364230" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.198719 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.344028 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-scripts\") pod \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.344122 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-logs\") pod \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.344140 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-combined-ca-bundle\") pod \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.344166 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-internal-tls-certs\") pod \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.344209 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-httpd-run\") pod \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.344240 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-config-data\") pod \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.344308 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.344355 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qnrd\" (UniqueName: \"kubernetes.io/projected/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-kube-api-access-8qnrd\") pod \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\" (UID: \"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0\") " Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.344600 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-logs" (OuterVolumeSpecName: "logs") pod "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" (UID: "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.344973 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.345256 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" (UID: "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.353134 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-scripts" (OuterVolumeSpecName: "scripts") pod "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" (UID: "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.358655 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-kube-api-access-8qnrd" (OuterVolumeSpecName: "kube-api-access-8qnrd") pod "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" (UID: "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0"). InnerVolumeSpecName "kube-api-access-8qnrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.368248 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" (UID: "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.406067 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" (UID: "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.408999 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-config-data" (OuterVolumeSpecName: "config-data") pod "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" (UID: "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.447152 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.447194 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qnrd\" (UniqueName: \"kubernetes.io/projected/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-kube-api-access-8qnrd\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.447209 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.447221 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 
09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.447232 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.447243 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.466702 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" (UID: "c9a50ad8-91ed-455d-9810-6f0dc1fb03f0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.481336 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.525971 4848 generic.go:334] "Generic (PLEG): container finished" podID="8c8222b5-0870-4866-bbc5-2017d0c92585" containerID="1d632eddee1987ee91aa3f4ae92d1677da235cfcb722e6d98aa4df16b43d3e9a" exitCode=0 Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.525998 4848 generic.go:334] "Generic (PLEG): container finished" podID="8c8222b5-0870-4866-bbc5-2017d0c92585" containerID="80020213f66ca77df170458f285fbdbaefff874e8b7e7ed1a0567df823777379" exitCode=143 Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.526055 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"8c8222b5-0870-4866-bbc5-2017d0c92585","Type":"ContainerDied","Data":"1d632eddee1987ee91aa3f4ae92d1677da235cfcb722e6d98aa4df16b43d3e9a"} Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.526099 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c8222b5-0870-4866-bbc5-2017d0c92585","Type":"ContainerDied","Data":"80020213f66ca77df170458f285fbdbaefff874e8b7e7ed1a0567df823777379"} Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.528039 4848 generic.go:334] "Generic (PLEG): container finished" podID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" containerID="ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4" exitCode=0 Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.528058 4848 generic.go:334] "Generic (PLEG): container finished" podID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" containerID="df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd" exitCode=143 Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.528869 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.529750 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0","Type":"ContainerDied","Data":"ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4"} Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.529821 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0","Type":"ContainerDied","Data":"df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd"} Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.529831 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"c9a50ad8-91ed-455d-9810-6f0dc1fb03f0","Type":"ContainerDied","Data":"d85988bde8daa1e446cab6f039951b7497a2918bf692d890663ac5d3155ad02e"} Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.529861 4848 scope.go:117] "RemoveContainer" containerID="ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.548624 4848 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.548661 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.617434 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.627903 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.641600 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:56 crc kubenswrapper[4848]: E0217 09:21:56.642122 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" containerName="glance-log" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.642138 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" containerName="glance-log" Feb 17 09:21:56 crc kubenswrapper[4848]: E0217 09:21:56.642158 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" containerName="glance-httpd" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.642164 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" containerName="glance-httpd" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.642354 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" containerName="glance-httpd" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.642375 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" containerName="glance-log" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.644328 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.646952 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.648621 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.653528 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.751626 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.751668 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.751691 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.751707 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.751844 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.751872 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.751944 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.752021 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph4q7\" (UniqueName: \"kubernetes.io/projected/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-kube-api-access-ph4q7\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.853954 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.854007 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.854053 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.854146 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.854203 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.854242 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.854304 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph4q7\" (UniqueName: \"kubernetes.io/projected/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-kube-api-access-ph4q7\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.854394 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.854428 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.855149 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-logs\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.855357 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.860327 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.860962 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.861889 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.866328 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.874162 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph4q7\" (UniqueName: \"kubernetes.io/projected/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-kube-api-access-ph4q7\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.885473 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:21:56 crc kubenswrapper[4848]: I0217 09:21:56.977833 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.224811 4848 scope.go:117] "RemoveContainer" containerID="df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.237203 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.273688 4848 scope.go:117] "RemoveContainer" containerID="ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4" Feb 17 09:21:57 crc kubenswrapper[4848]: E0217 09:21:57.284485 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4\": container with ID starting with ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4 not found: ID does not exist" containerID="ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.284530 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4"} err="failed to get container status \"ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4\": rpc error: code = NotFound desc = could not find container 
\"ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4\": container with ID starting with ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4 not found: ID does not exist" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.284556 4848 scope.go:117] "RemoveContainer" containerID="df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd" Feb 17 09:21:57 crc kubenswrapper[4848]: E0217 09:21:57.289122 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd\": container with ID starting with df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd not found: ID does not exist" containerID="df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.289178 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd"} err="failed to get container status \"df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd\": rpc error: code = NotFound desc = could not find container \"df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd\": container with ID starting with df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd not found: ID does not exist" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.289214 4848 scope.go:117] "RemoveContainer" containerID="ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.292908 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4"} err="failed to get container status \"ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4\": rpc error: code = NotFound desc = could not find 
container \"ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4\": container with ID starting with ae2205f1224eb6f1e65de3b96418f13cae6aa0021f4083ca8a63d1275a509ea4 not found: ID does not exist" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.292958 4848 scope.go:117] "RemoveContainer" containerID="df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.294337 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd"} err="failed to get container status \"df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd\": rpc error: code = NotFound desc = could not find container \"df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd\": container with ID starting with df3868e8ac577f0318c601dc6206914260d56d56fa100eef6d537919257171fd not found: ID does not exist" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.379911 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hplkp\" (UniqueName: \"kubernetes.io/projected/8c8222b5-0870-4866-bbc5-2017d0c92585-kube-api-access-hplkp\") pod \"8c8222b5-0870-4866-bbc5-2017d0c92585\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.379959 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-httpd-run\") pod \"8c8222b5-0870-4866-bbc5-2017d0c92585\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.380001 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-config-data\") pod \"8c8222b5-0870-4866-bbc5-2017d0c92585\" (UID: 
\"8c8222b5-0870-4866-bbc5-2017d0c92585\") " Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.380019 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-scripts\") pod \"8c8222b5-0870-4866-bbc5-2017d0c92585\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.380056 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-public-tls-certs\") pod \"8c8222b5-0870-4866-bbc5-2017d0c92585\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.380094 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"8c8222b5-0870-4866-bbc5-2017d0c92585\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.380187 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-logs\") pod \"8c8222b5-0870-4866-bbc5-2017d0c92585\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.380209 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-combined-ca-bundle\") pod \"8c8222b5-0870-4866-bbc5-2017d0c92585\" (UID: \"8c8222b5-0870-4866-bbc5-2017d0c92585\") " Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.387044 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-httpd-run" (OuterVolumeSpecName: "httpd-run") pod 
"8c8222b5-0870-4866-bbc5-2017d0c92585" (UID: "8c8222b5-0870-4866-bbc5-2017d0c92585"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.387198 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-logs" (OuterVolumeSpecName: "logs") pod "8c8222b5-0870-4866-bbc5-2017d0c92585" (UID: "8c8222b5-0870-4866-bbc5-2017d0c92585"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.390935 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c8222b5-0870-4866-bbc5-2017d0c92585-kube-api-access-hplkp" (OuterVolumeSpecName: "kube-api-access-hplkp") pod "8c8222b5-0870-4866-bbc5-2017d0c92585" (UID: "8c8222b5-0870-4866-bbc5-2017d0c92585"). InnerVolumeSpecName "kube-api-access-hplkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.394182 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "8c8222b5-0870-4866-bbc5-2017d0c92585" (UID: "8c8222b5-0870-4866-bbc5-2017d0c92585"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.394755 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-scripts" (OuterVolumeSpecName: "scripts") pod "8c8222b5-0870-4866-bbc5-2017d0c92585" (UID: "8c8222b5-0870-4866-bbc5-2017d0c92585"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.404953 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9a50ad8-91ed-455d-9810-6f0dc1fb03f0" path="/var/lib/kubelet/pods/c9a50ad8-91ed-455d-9810-6f0dc1fb03f0/volumes" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.431841 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c8222b5-0870-4866-bbc5-2017d0c92585" (UID: "8c8222b5-0870-4866-bbc5-2017d0c92585"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.479899 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8c8222b5-0870-4866-bbc5-2017d0c92585" (UID: "8c8222b5-0870-4866-bbc5-2017d0c92585"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.482136 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.482166 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.482177 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hplkp\" (UniqueName: \"kubernetes.io/projected/8c8222b5-0870-4866-bbc5-2017d0c92585-kube-api-access-hplkp\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.482186 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c8222b5-0870-4866-bbc5-2017d0c92585-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.482194 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.482203 4848 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.482224 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.531864 4848 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.537320 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-config-data" (OuterVolumeSpecName: "config-data") pod "8c8222b5-0870-4866-bbc5-2017d0c92585" (UID: "8c8222b5-0870-4866-bbc5-2017d0c92585"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.579845 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c8222b5-0870-4866-bbc5-2017d0c92585","Type":"ContainerDied","Data":"f47faa6c5da2226694633f1385a602ec112fece4dfd654ac67e6d71d875e949d"} Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.579898 4848 scope.go:117] "RemoveContainer" containerID="1d632eddee1987ee91aa3f4ae92d1677da235cfcb722e6d98aa4df16b43d3e9a" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.580112 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.585035 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c8222b5-0870-4866-bbc5-2017d0c92585-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.585081 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.640197 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.646532 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.673013 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:21:57 crc kubenswrapper[4848]: E0217 09:21:57.673545 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8222b5-0870-4866-bbc5-2017d0c92585" containerName="glance-log" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.673612 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8222b5-0870-4866-bbc5-2017d0c92585" containerName="glance-log" Feb 17 09:21:57 crc kubenswrapper[4848]: E0217 09:21:57.673691 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c8222b5-0870-4866-bbc5-2017d0c92585" containerName="glance-httpd" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.673749 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c8222b5-0870-4866-bbc5-2017d0c92585" containerName="glance-httpd" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.674015 4848 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8c8222b5-0870-4866-bbc5-2017d0c92585" containerName="glance-httpd" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.674106 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c8222b5-0870-4866-bbc5-2017d0c92585" containerName="glance-log" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.675260 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.677563 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.678917 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.694182 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.790648 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.791994 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-logs\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.792145 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d7l6\" (UniqueName: 
\"kubernetes.io/projected/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-kube-api-access-5d7l6\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.792196 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.792229 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.792442 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.792510 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-scripts\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.792562 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-config-data\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.854509 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.893826 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.893873 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-logs\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.893908 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d7l6\" (UniqueName: \"kubernetes.io/projected/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-kube-api-access-5d7l6\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.893929 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.893948 
4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.894004 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.894036 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-scripts\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.894056 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-config-data\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.895104 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-logs\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.895106 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.895495 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.899168 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-config-data\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.900352 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.901533 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-scripts\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.903440 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.913311 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d7l6\" (UniqueName: \"kubernetes.io/projected/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-kube-api-access-5d7l6\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:57 crc kubenswrapper[4848]: I0217 09:21:57.931904 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") " pod="openstack/glance-default-external-api-0" Feb 17 09:21:58 crc kubenswrapper[4848]: I0217 09:21:58.013677 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 17 09:21:58 crc kubenswrapper[4848]: I0217 09:21:58.591465 4848 generic.go:334] "Generic (PLEG): container finished" podID="0dd600b5-f56b-460a-acb0-a4dc5fd2de23" containerID="88d0e2594cadd88bd48db0488f569383e741546126bbcdfc360e84ebf2c60057" exitCode=0 Feb 17 09:21:58 crc kubenswrapper[4848]: I0217 09:21:58.591655 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j6p6m" event={"ID":"0dd600b5-f56b-460a-acb0-a4dc5fd2de23","Type":"ContainerDied","Data":"88d0e2594cadd88bd48db0488f569383e741546126bbcdfc360e84ebf2c60057"} Feb 17 09:21:59 crc kubenswrapper[4848]: I0217 09:21:59.394480 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c8222b5-0870-4866-bbc5-2017d0c92585" path="/var/lib/kubelet/pods/8c8222b5-0870-4866-bbc5-2017d0c92585/volumes" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.330161 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.381685 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-758zs"] Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.381931 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" podUID="5a53a69c-528b-4df5-9502-69316c3f345d" containerName="dnsmasq-dns" containerID="cri-o://b9c8eca747406d76a11c9d3666069d1d5f8e52aa19065a6c9ef06176f6369321" gracePeriod=10 Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.615585 4848 scope.go:117] "RemoveContainer" containerID="80020213f66ca77df170458f285fbdbaefff874e8b7e7ed1a0567df823777379" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.654007 4848 generic.go:334] "Generic (PLEG): container finished" podID="5a53a69c-528b-4df5-9502-69316c3f345d" 
containerID="b9c8eca747406d76a11c9d3666069d1d5f8e52aa19065a6c9ef06176f6369321" exitCode=0 Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.654110 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" event={"ID":"5a53a69c-528b-4df5-9502-69316c3f345d","Type":"ContainerDied","Data":"b9c8eca747406d76a11c9d3666069d1d5f8e52aa19065a6c9ef06176f6369321"} Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.663141 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j6p6m" event={"ID":"0dd600b5-f56b-460a-acb0-a4dc5fd2de23","Type":"ContainerDied","Data":"909dcb6a6020b6abc1c40d93046ade2ae815508ecdd0313365ebf0fdba8020b9"} Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.663176 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="909dcb6a6020b6abc1c40d93046ade2ae815508ecdd0313365ebf0fdba8020b9" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.691625 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4","Type":"ContainerStarted","Data":"fa5eabefbf28961d54f145f1b1ba5242223520cf7aa651859a2a86b19042bad2"} Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.794947 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.795180 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.837932 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.838469 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-676bdd79dd-lq228" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 
09:22:01.838933 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.963567 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdrf9\" (UniqueName: \"kubernetes.io/projected/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-kube-api-access-gdrf9\") pod \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.963626 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-fernet-keys\") pod \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.963683 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-combined-ca-bundle\") pod \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.963839 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-credential-keys\") pod \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.963867 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-config-data\") pod \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.963901 4848 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-scripts\") pod \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\" (UID: \"0dd600b5-f56b-460a-acb0-a4dc5fd2de23\") " Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.975098 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-scripts" (OuterVolumeSpecName: "scripts") pod "0dd600b5-f56b-460a-acb0-a4dc5fd2de23" (UID: "0dd600b5-f56b-460a-acb0-a4dc5fd2de23"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.975841 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0dd600b5-f56b-460a-acb0-a4dc5fd2de23" (UID: "0dd600b5-f56b-460a-acb0-a4dc5fd2de23"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.978499 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0dd600b5-f56b-460a-acb0-a4dc5fd2de23" (UID: "0dd600b5-f56b-460a-acb0-a4dc5fd2de23"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:01 crc kubenswrapper[4848]: I0217 09:22:01.991358 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-kube-api-access-gdrf9" (OuterVolumeSpecName: "kube-api-access-gdrf9") pod "0dd600b5-f56b-460a-acb0-a4dc5fd2de23" (UID: "0dd600b5-f56b-460a-acb0-a4dc5fd2de23"). InnerVolumeSpecName "kube-api-access-gdrf9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.037781 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dd600b5-f56b-460a-acb0-a4dc5fd2de23" (UID: "0dd600b5-f56b-460a-acb0-a4dc5fd2de23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.070155 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.070188 4848 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.070197 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.070205 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdrf9\" (UniqueName: \"kubernetes.io/projected/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-kube-api-access-gdrf9\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.070216 4848 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.091457 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.095376 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-config-data" (OuterVolumeSpecName: "config-data") pod "0dd600b5-f56b-460a-acb0-a4dc5fd2de23" (UID: "0dd600b5-f56b-460a-acb0-a4dc5fd2de23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.170695 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4v5bt\" (UniqueName: \"kubernetes.io/projected/5a53a69c-528b-4df5-9502-69316c3f345d-kube-api-access-4v5bt\") pod \"5a53a69c-528b-4df5-9502-69316c3f345d\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.171202 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-config\") pod \"5a53a69c-528b-4df5-9502-69316c3f345d\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.171328 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-swift-storage-0\") pod \"5a53a69c-528b-4df5-9502-69316c3f345d\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.171362 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-nb\") pod \"5a53a69c-528b-4df5-9502-69316c3f345d\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.171409 4848 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-sb\") pod \"5a53a69c-528b-4df5-9502-69316c3f345d\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.171464 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-svc\") pod \"5a53a69c-528b-4df5-9502-69316c3f345d\" (UID: \"5a53a69c-528b-4df5-9502-69316c3f345d\") " Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.171796 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd600b5-f56b-460a-acb0-a4dc5fd2de23-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.174704 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a53a69c-528b-4df5-9502-69316c3f345d-kube-api-access-4v5bt" (OuterVolumeSpecName: "kube-api-access-4v5bt") pod "5a53a69c-528b-4df5-9502-69316c3f345d" (UID: "5a53a69c-528b-4df5-9502-69316c3f345d"). InnerVolumeSpecName "kube-api-access-4v5bt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.229055 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.273384 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4v5bt\" (UniqueName: \"kubernetes.io/projected/5a53a69c-528b-4df5-9502-69316c3f345d-kube-api-access-4v5bt\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.311867 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5a53a69c-528b-4df5-9502-69316c3f345d" (UID: "5a53a69c-528b-4df5-9502-69316c3f345d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.320340 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5a53a69c-528b-4df5-9502-69316c3f345d" (UID: "5a53a69c-528b-4df5-9502-69316c3f345d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.320414 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5a53a69c-528b-4df5-9502-69316c3f345d" (UID: "5a53a69c-528b-4df5-9502-69316c3f345d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.340291 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.346271 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5a53a69c-528b-4df5-9502-69316c3f345d" (UID: "5a53a69c-528b-4df5-9502-69316c3f345d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.353269 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-config" (OuterVolumeSpecName: "config") pod "5a53a69c-528b-4df5-9502-69316c3f345d" (UID: "5a53a69c-528b-4df5-9502-69316c3f345d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.375609 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.375642 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.375656 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.375665 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.375680 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5a53a69c-528b-4df5-9502-69316c3f345d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.764485 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4","Type":"ContainerStarted","Data":"05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7"} Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.766650 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"e81fa19e-dc1a-4f09-8f05-b28099ae5f03","Type":"ContainerStarted","Data":"ed8525886dba4381a8ff46086a4bf6ff83e3beaa89f6091e6d44a5eff3cd732a"} Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.768777 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" event={"ID":"5a53a69c-528b-4df5-9502-69316c3f345d","Type":"ContainerDied","Data":"f9051a4a4a8a18de9820f7f57278f0021e79f8581ae2c476caa0f6cfb3fb89bd"} Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.768825 4848 scope.go:117] "RemoveContainer" containerID="b9c8eca747406d76a11c9d3666069d1d5f8e52aa19065a6c9ef06176f6369321" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.768916 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69c85d5ff7-758zs" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.784089 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ptthx" event={"ID":"9e0d87d6-fd65-4565-971f-070050f2f9ff","Type":"ContainerStarted","Data":"e0a8bd53dc65343b0d928854371b7e08c32dce5bf61b758ff61ed51429aa128d"} Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.804950 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-ptthx" podStartSLOduration=2.839319601 podStartE2EDuration="40.804932964s" podCreationTimestamp="2026-02-17 09:21:22 +0000 UTC" firstStartedPulling="2026-02-17 09:21:23.649935056 +0000 UTC m=+961.193190693" lastFinishedPulling="2026-02-17 09:22:01.61554841 +0000 UTC m=+999.158804056" observedRunningTime="2026-02-17 09:22:02.800936188 +0000 UTC m=+1000.344191834" watchObservedRunningTime="2026-02-17 09:22:02.804932964 +0000 UTC m=+1000.348188610" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.818828 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j6p6m" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.824561 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19facc80-e9df-42dc-8124-7619b2167b5c","Type":"ContainerStarted","Data":"41572746e34d178fd02daf01a2fe3cb41b45a45e18e1670540f5d2845cab6370"} Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.849274 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-758zs"] Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.860145 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69c85d5ff7-758zs"] Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.860686 4848 scope.go:117] "RemoveContainer" containerID="8f981bde4acf13c57b7568f1a14677c9dde5003043b88fe191980311b72a15a4" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.964450 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-76c7ffd8bf-x42cc"] Feb 17 09:22:02 crc kubenswrapper[4848]: E0217 09:22:02.964986 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd600b5-f56b-460a-acb0-a4dc5fd2de23" containerName="keystone-bootstrap" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.965112 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd600b5-f56b-460a-acb0-a4dc5fd2de23" containerName="keystone-bootstrap" Feb 17 09:22:02 crc kubenswrapper[4848]: E0217 09:22:02.965194 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a53a69c-528b-4df5-9502-69316c3f345d" containerName="init" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.965248 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a53a69c-528b-4df5-9502-69316c3f345d" containerName="init" Feb 17 09:22:02 crc kubenswrapper[4848]: E0217 09:22:02.965307 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a53a69c-528b-4df5-9502-69316c3f345d" 
containerName="dnsmasq-dns" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.965354 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a53a69c-528b-4df5-9502-69316c3f345d" containerName="dnsmasq-dns" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.965557 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a53a69c-528b-4df5-9502-69316c3f345d" containerName="dnsmasq-dns" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.965630 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd600b5-f56b-460a-acb0-a4dc5fd2de23" containerName="keystone-bootstrap" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.966267 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.985407 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.985578 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.985687 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.985807 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.985921 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 17 09:22:02 crc kubenswrapper[4848]: I0217 09:22:02.986174 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-p6mz5" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.034365 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76c7ffd8bf-x42cc"] Feb 17 09:22:03 crc 
kubenswrapper[4848]: I0217 09:22:03.117123 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-public-tls-certs\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.117318 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-combined-ca-bundle\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.117585 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpmcq\" (UniqueName: \"kubernetes.io/projected/fc74976d-87c5-406c-9e25-4f89c5fc2307-kube-api-access-lpmcq\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.117654 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-config-data\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.117727 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-credential-keys\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" 
Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.117841 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-internal-tls-certs\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.117918 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-scripts\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.118007 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-fernet-keys\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.225055 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpmcq\" (UniqueName: \"kubernetes.io/projected/fc74976d-87c5-406c-9e25-4f89c5fc2307-kube-api-access-lpmcq\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.225096 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-config-data\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: 
I0217 09:22:03.225123 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-credential-keys\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.225144 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-internal-tls-certs\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.225170 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-scripts\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.225202 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-fernet-keys\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.225220 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-public-tls-certs\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.225236 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-combined-ca-bundle\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.252518 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-config-data\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.252931 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-credential-keys\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.253262 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-internal-tls-certs\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.259866 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-fernet-keys\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.280042 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-public-tls-certs\") pod 
\"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.282891 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpmcq\" (UniqueName: \"kubernetes.io/projected/fc74976d-87c5-406c-9e25-4f89c5fc2307-kube-api-access-lpmcq\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.286234 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-scripts\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.310222 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc74976d-87c5-406c-9e25-4f89c5fc2307-combined-ca-bundle\") pod \"keystone-76c7ffd8bf-x42cc\" (UID: \"fc74976d-87c5-406c-9e25-4f89c5fc2307\") " pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.337485 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.406947 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a53a69c-528b-4df5-9502-69316c3f345d" path="/var/lib/kubelet/pods/5a53a69c-528b-4df5-9502-69316c3f345d/volumes" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.847044 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4","Type":"ContainerStarted","Data":"7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247"} Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.858582 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-76c7ffd8bf-x42cc"] Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.871607 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e81fa19e-dc1a-4f09-8f05-b28099ae5f03","Type":"ContainerStarted","Data":"c24f289f8c51994798352ea14feaba690fc4d2d2b20d674b644427de659a078d"} Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.886784 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.883751548 podStartE2EDuration="7.883751548s" podCreationTimestamp="2026-02-17 09:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:03.877684361 +0000 UTC m=+1001.420940007" watchObservedRunningTime="2026-02-17 09:22:03.883751548 +0000 UTC m=+1001.427007194" Feb 17 09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.910257 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jq6l8" event={"ID":"dcb8e782-fe60-4c41-a843-1980fc8ab3cc","Type":"ContainerStarted","Data":"2b466d5e14ae33a601d99defdc04b1b4d7027f30f25820940d7dcb74da3162bc"} Feb 17 
09:22:03 crc kubenswrapper[4848]: I0217 09:22:03.946060 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-jq6l8" podStartSLOduration=2.420129696 podStartE2EDuration="41.946038417s" podCreationTimestamp="2026-02-17 09:21:22 +0000 UTC" firstStartedPulling="2026-02-17 09:21:23.408500184 +0000 UTC m=+960.951755830" lastFinishedPulling="2026-02-17 09:22:02.934408905 +0000 UTC m=+1000.477664551" observedRunningTime="2026-02-17 09:22:03.933265276 +0000 UTC m=+1001.476520922" watchObservedRunningTime="2026-02-17 09:22:03.946038417 +0000 UTC m=+1001.489294063" Feb 17 09:22:04 crc kubenswrapper[4848]: I0217 09:22:04.936825 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e81fa19e-dc1a-4f09-8f05-b28099ae5f03","Type":"ContainerStarted","Data":"0a78c085be1066ed063fef22ab745f758a882d08a488417f71ebd578f2619f8f"} Feb 17 09:22:04 crc kubenswrapper[4848]: I0217 09:22:04.946506 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76c7ffd8bf-x42cc" event={"ID":"fc74976d-87c5-406c-9e25-4f89c5fc2307","Type":"ContainerStarted","Data":"aba7c58837bbc5735b1ecc0721caed7f4e5b65a722312bdd2d80c6f376c71e68"} Feb 17 09:22:04 crc kubenswrapper[4848]: I0217 09:22:04.946559 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:04 crc kubenswrapper[4848]: I0217 09:22:04.946572 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-76c7ffd8bf-x42cc" event={"ID":"fc74976d-87c5-406c-9e25-4f89c5fc2307","Type":"ContainerStarted","Data":"6e5040e5f8ec685a3daed8dfb9684f9e198978c16a97a25b3eb82cef38f22858"} Feb 17 09:22:04 crc kubenswrapper[4848]: I0217 09:22:04.968846 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.968832652 podStartE2EDuration="7.968832652s" 
podCreationTimestamp="2026-02-17 09:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:04.967173364 +0000 UTC m=+1002.510429010" watchObservedRunningTime="2026-02-17 09:22:04.968832652 +0000 UTC m=+1002.512088298" Feb 17 09:22:04 crc kubenswrapper[4848]: I0217 09:22:04.990307 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-76c7ffd8bf-x42cc" podStartSLOduration=2.990289035 podStartE2EDuration="2.990289035s" podCreationTimestamp="2026-02-17 09:22:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:04.990157991 +0000 UTC m=+1002.533413637" watchObservedRunningTime="2026-02-17 09:22:04.990289035 +0000 UTC m=+1002.533544681" Feb 17 09:22:05 crc kubenswrapper[4848]: I0217 09:22:05.995343 4848 generic.go:334] "Generic (PLEG): container finished" podID="9e0d87d6-fd65-4565-971f-070050f2f9ff" containerID="e0a8bd53dc65343b0d928854371b7e08c32dce5bf61b758ff61ed51429aa128d" exitCode=0 Feb 17 09:22:05 crc kubenswrapper[4848]: I0217 09:22:05.997067 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ptthx" event={"ID":"9e0d87d6-fd65-4565-971f-070050f2f9ff","Type":"ContainerDied","Data":"e0a8bd53dc65343b0d928854371b7e08c32dce5bf61b758ff61ed51429aa128d"} Feb 17 09:22:06 crc kubenswrapper[4848]: I0217 09:22:06.978616 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 09:22:06 crc kubenswrapper[4848]: I0217 09:22:06.978892 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 09:22:07 crc kubenswrapper[4848]: I0217 09:22:07.016348 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lh5dl" 
event={"ID":"ba8c67b5-216a-4f60-baee-c1b6211f89ec","Type":"ContainerStarted","Data":"b017242cfa16fac3f6a3292dc1d6ae30e4633ac9f56cd1d03d93c208f868655b"} Feb 17 09:22:07 crc kubenswrapper[4848]: I0217 09:22:07.032700 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-lh5dl" podStartSLOduration=3.395600124 podStartE2EDuration="46.032685875s" podCreationTimestamp="2026-02-17 09:21:21 +0000 UTC" firstStartedPulling="2026-02-17 09:21:23.232858653 +0000 UTC m=+960.776114299" lastFinishedPulling="2026-02-17 09:22:05.869944414 +0000 UTC m=+1003.413200050" observedRunningTime="2026-02-17 09:22:07.032442758 +0000 UTC m=+1004.575698404" watchObservedRunningTime="2026-02-17 09:22:07.032685875 +0000 UTC m=+1004.575941521" Feb 17 09:22:07 crc kubenswrapper[4848]: I0217 09:22:07.034173 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 09:22:07 crc kubenswrapper[4848]: I0217 09:22:07.036628 4848 generic.go:334] "Generic (PLEG): container finished" podID="dcb8e782-fe60-4c41-a843-1980fc8ab3cc" containerID="2b466d5e14ae33a601d99defdc04b1b4d7027f30f25820940d7dcb74da3162bc" exitCode=0 Feb 17 09:22:07 crc kubenswrapper[4848]: I0217 09:22:07.036815 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jq6l8" event={"ID":"dcb8e782-fe60-4c41-a843-1980fc8ab3cc","Type":"ContainerDied","Data":"2b466d5e14ae33a601d99defdc04b1b4d7027f30f25820940d7dcb74da3162bc"} Feb 17 09:22:07 crc kubenswrapper[4848]: I0217 09:22:07.037111 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 09:22:07 crc kubenswrapper[4848]: I0217 09:22:07.096606 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 09:22:08 crc kubenswrapper[4848]: I0217 09:22:08.013901 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 09:22:08 crc kubenswrapper[4848]: I0217 09:22:08.014263 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 09:22:08 crc kubenswrapper[4848]: I0217 09:22:08.061242 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 09:22:08 crc kubenswrapper[4848]: I0217 09:22:08.061294 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 09:22:08 crc kubenswrapper[4848]: I0217 09:22:08.061729 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 09:22:08 crc kubenswrapper[4848]: I0217 09:22:08.113177 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 09:22:09 crc kubenswrapper[4848]: I0217 09:22:09.081163 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 09:22:10 crc kubenswrapper[4848]: I0217 09:22:10.095250 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:22:10 crc kubenswrapper[4848]: I0217 09:22:10.095277 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:22:10 crc kubenswrapper[4848]: I0217 09:22:10.434667 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 09:22:10 crc kubenswrapper[4848]: I0217 09:22:10.435652 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 09:22:11 crc kubenswrapper[4848]: I0217 09:22:11.500809 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 09:22:11 
crc kubenswrapper[4848]: I0217 09:22:11.501214 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:22:11 crc kubenswrapper[4848]: I0217 09:22:11.508644 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 09:22:11 crc kubenswrapper[4848]: I0217 09:22:11.795381 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-749cc47784-q9crv" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 17 09:22:11 crc kubenswrapper[4848]: I0217 09:22:11.838339 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-676bdd79dd-lq228" podUID="96fd6f0e-96ad-4a88-85ff-78f450b24279" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.128819 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-ptthx" event={"ID":"9e0d87d6-fd65-4565-971f-070050f2f9ff","Type":"ContainerDied","Data":"5bc983f81013248609bcd710557e0ae5b6eae67846db99d4bd96bce658eaa39e"} Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.129160 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bc983f81013248609bcd710557e0ae5b6eae67846db99d4bd96bce658eaa39e" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.130524 4848 generic.go:334] "Generic (PLEG): container finished" podID="ba8c67b5-216a-4f60-baee-c1b6211f89ec" containerID="b017242cfa16fac3f6a3292dc1d6ae30e4633ac9f56cd1d03d93c208f868655b" exitCode=0 Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.130567 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-db-sync-lh5dl" event={"ID":"ba8c67b5-216a-4f60-baee-c1b6211f89ec","Type":"ContainerDied","Data":"b017242cfa16fac3f6a3292dc1d6ae30e4633ac9f56cd1d03d93c208f868655b"} Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.132268 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-jq6l8" event={"ID":"dcb8e782-fe60-4c41-a843-1980fc8ab3cc","Type":"ContainerDied","Data":"45f640070fa3ef2f5dda9892b6cb7dd122c44c384928d27ce8e0aa46dea0cfef"} Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.132303 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45f640070fa3ef2f5dda9892b6cb7dd122c44c384928d27ce8e0aa46dea0cfef" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.165751 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.182712 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ptthx" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.250190 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-combined-ca-bundle\") pod \"9e0d87d6-fd65-4565-971f-070050f2f9ff\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.250272 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-scripts\") pod \"9e0d87d6-fd65-4565-971f-070050f2f9ff\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.250333 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e0d87d6-fd65-4565-971f-070050f2f9ff-logs\") pod \"9e0d87d6-fd65-4565-971f-070050f2f9ff\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.250361 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-db-sync-config-data\") pod \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.250417 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d24gr\" (UniqueName: \"kubernetes.io/projected/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-kube-api-access-d24gr\") pod \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.250453 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njvmh\" 
(UniqueName: \"kubernetes.io/projected/9e0d87d6-fd65-4565-971f-070050f2f9ff-kube-api-access-njvmh\") pod \"9e0d87d6-fd65-4565-971f-070050f2f9ff\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.250489 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-config-data\") pod \"9e0d87d6-fd65-4565-971f-070050f2f9ff\" (UID: \"9e0d87d6-fd65-4565-971f-070050f2f9ff\") " Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.250577 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-combined-ca-bundle\") pod \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\" (UID: \"dcb8e782-fe60-4c41-a843-1980fc8ab3cc\") " Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.250684 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e0d87d6-fd65-4565-971f-070050f2f9ff-logs" (OuterVolumeSpecName: "logs") pod "9e0d87d6-fd65-4565-971f-070050f2f9ff" (UID: "9e0d87d6-fd65-4565-971f-070050f2f9ff"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.250944 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e0d87d6-fd65-4565-971f-070050f2f9ff-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.256795 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e0d87d6-fd65-4565-971f-070050f2f9ff-kube-api-access-njvmh" (OuterVolumeSpecName: "kube-api-access-njvmh") pod "9e0d87d6-fd65-4565-971f-070050f2f9ff" (UID: "9e0d87d6-fd65-4565-971f-070050f2f9ff"). InnerVolumeSpecName "kube-api-access-njvmh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.256930 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-kube-api-access-d24gr" (OuterVolumeSpecName: "kube-api-access-d24gr") pod "dcb8e782-fe60-4c41-a843-1980fc8ab3cc" (UID: "dcb8e782-fe60-4c41-a843-1980fc8ab3cc"). InnerVolumeSpecName "kube-api-access-d24gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.257422 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-scripts" (OuterVolumeSpecName: "scripts") pod "9e0d87d6-fd65-4565-971f-070050f2f9ff" (UID: "9e0d87d6-fd65-4565-971f-070050f2f9ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.263255 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "dcb8e782-fe60-4c41-a843-1980fc8ab3cc" (UID: "dcb8e782-fe60-4c41-a843-1980fc8ab3cc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.279929 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dcb8e782-fe60-4c41-a843-1980fc8ab3cc" (UID: "dcb8e782-fe60-4c41-a843-1980fc8ab3cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.281851 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-config-data" (OuterVolumeSpecName: "config-data") pod "9e0d87d6-fd65-4565-971f-070050f2f9ff" (UID: "9e0d87d6-fd65-4565-971f-070050f2f9ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.304639 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e0d87d6-fd65-4565-971f-070050f2f9ff" (UID: "9e0d87d6-fd65-4565-971f-070050f2f9ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.352973 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d24gr\" (UniqueName: \"kubernetes.io/projected/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-kube-api-access-d24gr\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.353250 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njvmh\" (UniqueName: \"kubernetes.io/projected/9e0d87d6-fd65-4565-971f-070050f2f9ff-kube-api-access-njvmh\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.353262 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.353274 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-combined-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.353283 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.353291 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e0d87d6-fd65-4565-971f-070050f2f9ff-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:13 crc kubenswrapper[4848]: I0217 09:22:13.353300 4848 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/dcb8e782-fe60-4c41-a843-1980fc8ab3cc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.140874 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-jq6l8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.140930 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-ptthx" Feb 17 09:22:14 crc kubenswrapper[4848]: E0217 09:22:14.246322 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.355085 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7b6cdb54-tkxbl"] Feb 17 09:22:14 crc kubenswrapper[4848]: E0217 09:22:14.355518 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e0d87d6-fd65-4565-971f-070050f2f9ff" containerName="placement-db-sync" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.355541 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e0d87d6-fd65-4565-971f-070050f2f9ff" containerName="placement-db-sync" Feb 17 09:22:14 crc kubenswrapper[4848]: E0217 09:22:14.355555 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcb8e782-fe60-4c41-a843-1980fc8ab3cc" containerName="barbican-db-sync" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.355563 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcb8e782-fe60-4c41-a843-1980fc8ab3cc" containerName="barbican-db-sync" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.355794 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcb8e782-fe60-4c41-a843-1980fc8ab3cc" containerName="barbican-db-sync" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.355816 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e0d87d6-fd65-4565-971f-070050f2f9ff" containerName="placement-db-sync" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.357103 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.367920 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.368036 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.368159 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-7stp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.368360 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.368535 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.414406 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b6cdb54-tkxbl"] Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.484795 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eba884ff-2e19-4dca-ba2e-75a8a311ea19-logs\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.484901 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnkf8\" (UniqueName: \"kubernetes.io/projected/eba884ff-2e19-4dca-ba2e-75a8a311ea19-kube-api-access-cnkf8\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.484925 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-scripts\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.484944 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-internal-tls-certs\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.484994 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-public-tls-certs\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.485012 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-config-data\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.485027 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-combined-ca-bundle\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.485833 4848 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack/barbican-worker-848f449699-2nhmn"] Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.487267 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.490169 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.490283 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.490849 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-nshcl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.522474 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6c677d9df8-z5nnn"] Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.524029 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.532227 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.565331 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-848f449699-2nhmn"] Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.571631 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c677d9df8-z5nnn"] Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.581900 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-zm6nm"] Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.583781 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.591726 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-public-tls-certs\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.591787 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-config-data\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.591809 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-combined-ca-bundle\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.591830 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd92adeb-535d-4d36-a176-b5cd3ca667dc-config-data\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.591865 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd92adeb-535d-4d36-a176-b5cd3ca667dc-combined-ca-bundle\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " 
pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.591914 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b9137d-75ca-4b52-9338-6bf15270a667-logs\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592003 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd92adeb-535d-4d36-a176-b5cd3ca667dc-config-data-custom\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592038 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b9137d-75ca-4b52-9338-6bf15270a667-combined-ca-bundle\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592056 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eba884ff-2e19-4dca-ba2e-75a8a311ea19-logs\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592093 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqsxq\" (UniqueName: \"kubernetes.io/projected/49b9137d-75ca-4b52-9338-6bf15270a667-kube-api-access-lqsxq\") pod 
\"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592122 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49b9137d-75ca-4b52-9338-6bf15270a667-config-data-custom\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592149 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b9137d-75ca-4b52-9338-6bf15270a667-config-data\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592174 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnkf8\" (UniqueName: \"kubernetes.io/projected/eba884ff-2e19-4dca-ba2e-75a8a311ea19-kube-api-access-cnkf8\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592192 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhtkg\" (UniqueName: \"kubernetes.io/projected/dd92adeb-535d-4d36-a176-b5cd3ca667dc-kube-api-access-xhtkg\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592214 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd92adeb-535d-4d36-a176-b5cd3ca667dc-logs\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592236 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-scripts\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.592254 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-internal-tls-certs\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.602555 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eba884ff-2e19-4dca-ba2e-75a8a311ea19-logs\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.606721 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-public-tls-certs\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.616617 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-combined-ca-bundle\") pod 
\"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.621272 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-scripts\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.623792 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-internal-tls-certs\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.629442 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba884ff-2e19-4dca-ba2e-75a8a311ea19-config-data\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.631311 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnkf8\" (UniqueName: \"kubernetes.io/projected/eba884ff-2e19-4dca-ba2e-75a8a311ea19-kube-api-access-cnkf8\") pod \"placement-7b6cdb54-tkxbl\" (UID: \"eba884ff-2e19-4dca-ba2e-75a8a311ea19\") " pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.654828 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-zm6nm"] Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693635 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/dd92adeb-535d-4d36-a176-b5cd3ca667dc-config-data-custom\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693693 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk259\" (UniqueName: \"kubernetes.io/projected/f64da7f0-6afa-4c5f-913d-2127c033db81-kube-api-access-wk259\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693722 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-config\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693747 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b9137d-75ca-4b52-9338-6bf15270a667-combined-ca-bundle\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693791 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-swift-storage-0\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693822 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lqsxq\" (UniqueName: \"kubernetes.io/projected/49b9137d-75ca-4b52-9338-6bf15270a667-kube-api-access-lqsxq\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693848 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-sb\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693870 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49b9137d-75ca-4b52-9338-6bf15270a667-config-data-custom\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693900 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b9137d-75ca-4b52-9338-6bf15270a667-config-data\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693933 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhtkg\" (UniqueName: \"kubernetes.io/projected/dd92adeb-535d-4d36-a176-b5cd3ca667dc-kube-api-access-xhtkg\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 
09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.693958 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd92adeb-535d-4d36-a176-b5cd3ca667dc-logs\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.694052 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd92adeb-535d-4d36-a176-b5cd3ca667dc-config-data\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.694086 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd92adeb-535d-4d36-a176-b5cd3ca667dc-combined-ca-bundle\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.694117 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-svc\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.694156 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b9137d-75ca-4b52-9338-6bf15270a667-logs\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 
09:22:14.694189 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-nb\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.697298 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dd92adeb-535d-4d36-a176-b5cd3ca667dc-logs\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.697819 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49b9137d-75ca-4b52-9338-6bf15270a667-logs\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.701396 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dd92adeb-535d-4d36-a176-b5cd3ca667dc-config-data-custom\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.702058 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd92adeb-535d-4d36-a176-b5cd3ca667dc-combined-ca-bundle\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.703832 4848 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/barbican-api-65667fcd94-ngsp8"] Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.705236 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.722650 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65667fcd94-ngsp8"] Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.733113 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.733519 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49b9137d-75ca-4b52-9338-6bf15270a667-combined-ca-bundle\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.733990 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49b9137d-75ca-4b52-9338-6bf15270a667-config-data-custom\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.734095 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49b9137d-75ca-4b52-9338-6bf15270a667-config-data\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.735976 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.739548 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd92adeb-535d-4d36-a176-b5cd3ca667dc-config-data\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.744015 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhtkg\" (UniqueName: \"kubernetes.io/projected/dd92adeb-535d-4d36-a176-b5cd3ca667dc-kube-api-access-xhtkg\") pod \"barbican-worker-848f449699-2nhmn\" (UID: \"dd92adeb-535d-4d36-a176-b5cd3ca667dc\") " pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.757658 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqsxq\" (UniqueName: \"kubernetes.io/projected/49b9137d-75ca-4b52-9338-6bf15270a667-kube-api-access-lqsxq\") pod \"barbican-keystone-listener-6c677d9df8-z5nnn\" (UID: \"49b9137d-75ca-4b52-9338-6bf15270a667\") " pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.782051 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795308 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-sb\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795415 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-svc\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795443 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fbd606b-59f6-4ba4-8e62-d90235b987d4-logs\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795469 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-combined-ca-bundle\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795487 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncs58\" (UniqueName: \"kubernetes.io/projected/5fbd606b-59f6-4ba4-8e62-d90235b987d4-kube-api-access-ncs58\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: 
\"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795696 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-nb\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795745 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk259\" (UniqueName: \"kubernetes.io/projected/f64da7f0-6afa-4c5f-913d-2127c033db81-kube-api-access-wk259\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795778 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data-custom\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795800 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-config\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795823 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: 
\"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.795844 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-swift-storage-0\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.796742 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-swift-storage-0\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.797084 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-sb\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.802540 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-nb\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.802958 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-svc\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " 
pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.814553 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-config\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.816371 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk259\" (UniqueName: \"kubernetes.io/projected/f64da7f0-6afa-4c5f-913d-2127c033db81-kube-api-access-wk259\") pod \"dnsmasq-dns-6b55f48d49-zm6nm\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.823167 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-848f449699-2nhmn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.855139 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.897626 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-combined-ca-bundle\") pod \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.897725 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-db-sync-config-data\") pod \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.897778 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-config-data\") pod \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.897848 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba8c67b5-216a-4f60-baee-c1b6211f89ec-etc-machine-id\") pod \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.897876 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rr7h8\" (UniqueName: \"kubernetes.io/projected/ba8c67b5-216a-4f60-baee-c1b6211f89ec-kube-api-access-rr7h8\") pod \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.897908 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-scripts\") pod \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\" (UID: \"ba8c67b5-216a-4f60-baee-c1b6211f89ec\") " Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.898115 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fbd606b-59f6-4ba4-8e62-d90235b987d4-logs\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.898142 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-combined-ca-bundle\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.898157 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncs58\" (UniqueName: \"kubernetes.io/projected/5fbd606b-59f6-4ba4-8e62-d90235b987d4-kube-api-access-ncs58\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.898209 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data-custom\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.898244 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.903799 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.906876 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ba8c67b5-216a-4f60-baee-c1b6211f89ec-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "ba8c67b5-216a-4f60-baee-c1b6211f89ec" (UID: "ba8c67b5-216a-4f60-baee-c1b6211f89ec"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.912103 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "ba8c67b5-216a-4f60-baee-c1b6211f89ec" (UID: "ba8c67b5-216a-4f60-baee-c1b6211f89ec"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.912396 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fbd606b-59f6-4ba4-8e62-d90235b987d4-logs\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.914870 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-scripts" (OuterVolumeSpecName: "scripts") pod "ba8c67b5-216a-4f60-baee-c1b6211f89ec" (UID: "ba8c67b5-216a-4f60-baee-c1b6211f89ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.935898 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba8c67b5-216a-4f60-baee-c1b6211f89ec-kube-api-access-rr7h8" (OuterVolumeSpecName: "kube-api-access-rr7h8") pod "ba8c67b5-216a-4f60-baee-c1b6211f89ec" (UID: "ba8c67b5-216a-4f60-baee-c1b6211f89ec"). InnerVolumeSpecName "kube-api-access-rr7h8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.938281 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-combined-ca-bundle\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.952575 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data-custom\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:14 crc kubenswrapper[4848]: I0217 09:22:14.968351 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncs58\" (UniqueName: \"kubernetes.io/projected/5fbd606b-59f6-4ba4-8e62-d90235b987d4-kube-api-access-ncs58\") pod \"barbican-api-65667fcd94-ngsp8\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.023503 4848 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.023832 4848 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/ba8c67b5-216a-4f60-baee-c1b6211f89ec-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.023851 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rr7h8\" (UniqueName: 
\"kubernetes.io/projected/ba8c67b5-216a-4f60-baee-c1b6211f89ec-kube-api-access-rr7h8\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.023862 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.059593 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.072808 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba8c67b5-216a-4f60-baee-c1b6211f89ec" (UID: "ba8c67b5-216a-4f60-baee-c1b6211f89ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.089803 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.091901 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-config-data" (OuterVolumeSpecName: "config-data") pod "ba8c67b5-216a-4f60-baee-c1b6211f89ec" (UID: "ba8c67b5-216a-4f60-baee-c1b6211f89ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.125977 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.126017 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba8c67b5-216a-4f60-baee-c1b6211f89ec-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.262682 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19facc80-e9df-42dc-8124-7619b2167b5c","Type":"ContainerStarted","Data":"007d29b079b19764f9d7719f3e34df2f08cfcf6b91488bdcdc51c036c2ca9bc1"} Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.262888 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="ceilometer-notification-agent" containerID="cri-o://d694ea51f368836cbddab6d27bab1acb6fcbace326a0b7a7bd157af6be65915f" gracePeriod=30 Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.263153 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.263391 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="proxy-httpd" containerID="cri-o://007d29b079b19764f9d7719f3e34df2f08cfcf6b91488bdcdc51c036c2ca9bc1" gracePeriod=30 Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.263432 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="sg-core" 
containerID="cri-o://41572746e34d178fd02daf01a2fe3cb41b45a45e18e1670540f5d2845cab6370" gracePeriod=30 Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.277062 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-lh5dl" event={"ID":"ba8c67b5-216a-4f60-baee-c1b6211f89ec","Type":"ContainerDied","Data":"9ffb7dcd3c99ade820dfc37ea257ff6575cd1dc4c0bd6ae46857933d256e0926"} Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.277211 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ffb7dcd3c99ade820dfc37ea257ff6575cd1dc4c0bd6ae46857933d256e0926" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.277342 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-lh5dl" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.512326 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 09:22:15 crc kubenswrapper[4848]: E0217 09:22:15.517228 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba8c67b5-216a-4f60-baee-c1b6211f89ec" containerName="cinder-db-sync" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.528672 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba8c67b5-216a-4f60-baee-c1b6211f89ec" containerName="cinder-db-sync" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.529407 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba8c67b5-216a-4f60-baee-c1b6211f89ec" containerName="cinder-db-sync" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.530476 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.536400 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.536636 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.536780 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-d9d6p" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.536900 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.560284 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.572569 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-zm6nm"] Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.581200 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7b6cdb54-tkxbl"] Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.627487 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-8jdgt"] Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.629066 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.647834 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.647890 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-scripts\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.647913 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d9a2cc0-eea8-401d-85db-7824e2ba0463-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.647968 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.647997 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqshj\" (UniqueName: \"kubernetes.io/projected/6d9a2cc0-eea8-401d-85db-7824e2ba0463-kube-api-access-sqshj\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " 
pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.648018 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.648127 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-8jdgt"] Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.749728 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.749794 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqshj\" (UniqueName: \"kubernetes.io/projected/6d9a2cc0-eea8-401d-85db-7824e2ba0463-kube-api-access-sqshj\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.749816 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.749860 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thk8g\" (UniqueName: 
\"kubernetes.io/projected/78406f6e-a154-4a75-96f6-99f77e092176-kube-api-access-thk8g\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.749884 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.749901 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.749922 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-config\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.749962 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.749988 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-svc\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.750007 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.750037 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-scripts\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.750058 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d9a2cc0-eea8-401d-85db-7824e2ba0463-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.750145 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d9a2cc0-eea8-401d-85db-7824e2ba0463-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.752823 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6c677d9df8-z5nnn"] Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.760124 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.760880 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.764377 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.765493 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-scripts\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.768031 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.768435 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.775001 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.778556 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.785860 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqshj\" (UniqueName: \"kubernetes.io/projected/6d9a2cc0-eea8-401d-85db-7824e2ba0463-kube-api-access-sqshj\") pod \"cinder-scheduler-0\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.852189 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-svc\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.852463 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data-custom\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.852555 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dce8fa9b-bcca-459f-8483-60ed22b3e383-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.852645 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-scripts\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.852811 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dce8fa9b-bcca-459f-8483-60ed22b3e383-logs\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.852899 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.853007 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thk8g\" (UniqueName: \"kubernetes.io/projected/78406f6e-a154-4a75-96f6-99f77e092176-kube-api-access-thk8g\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.853104 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.853194 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.853284 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-config\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.853380 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.853502 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.853600 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnvhg\" (UniqueName: \"kubernetes.io/projected/dce8fa9b-bcca-459f-8483-60ed22b3e383-kube-api-access-tnvhg\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.853901 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-svc\") pod 
\"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.854447 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-config\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.855199 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-nb\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.855218 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-swift-storage-0\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.858450 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-sb\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.895555 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thk8g\" (UniqueName: \"kubernetes.io/projected/78406f6e-a154-4a75-96f6-99f77e092176-kube-api-access-thk8g\") pod \"dnsmasq-dns-6dc67df487-8jdgt\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " 
pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.904579 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.949485 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.955968 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnvhg\" (UniqueName: \"kubernetes.io/projected/dce8fa9b-bcca-459f-8483-60ed22b3e383-kube-api-access-tnvhg\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.956033 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dce8fa9b-bcca-459f-8483-60ed22b3e383-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.956049 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data-custom\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.956071 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-scripts\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.956138 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/dce8fa9b-bcca-459f-8483-60ed22b3e383-logs\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.956159 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.956208 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.956353 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-848f449699-2nhmn"] Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.956449 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dce8fa9b-bcca-459f-8483-60ed22b3e383-etc-machine-id\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.966261 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.967372 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-scripts\") pod 
\"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.967674 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.968915 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dce8fa9b-bcca-459f-8483-60ed22b3e383-logs\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.969481 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data-custom\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:15 crc kubenswrapper[4848]: I0217 09:22:15.994392 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnvhg\" (UniqueName: \"kubernetes.io/projected/dce8fa9b-bcca-459f-8483-60ed22b3e383-kube-api-access-tnvhg\") pod \"cinder-api-0\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " pod="openstack/cinder-api-0" Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.106923 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-zm6nm"] Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.112512 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.131094 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-65667fcd94-ngsp8"] Feb 17 09:22:16 crc kubenswrapper[4848]: W0217 09:22:16.165575 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fbd606b_59f6_4ba4_8e62_d90235b987d4.slice/crio-379431ab4949f910acbf9a00dc4eb17b6386de3783ab8f2d248a89617877e8bc WatchSource:0}: Error finding container 379431ab4949f910acbf9a00dc4eb17b6386de3783ab8f2d248a89617877e8bc: Status 404 returned error can't find the container with id 379431ab4949f910acbf9a00dc4eb17b6386de3783ab8f2d248a89617877e8bc Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.368593 4848 generic.go:334] "Generic (PLEG): container finished" podID="19facc80-e9df-42dc-8124-7619b2167b5c" containerID="007d29b079b19764f9d7719f3e34df2f08cfcf6b91488bdcdc51c036c2ca9bc1" exitCode=0 Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.369242 4848 generic.go:334] "Generic (PLEG): container finished" podID="19facc80-e9df-42dc-8124-7619b2167b5c" containerID="41572746e34d178fd02daf01a2fe3cb41b45a45e18e1670540f5d2845cab6370" exitCode=2 Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.370386 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19facc80-e9df-42dc-8124-7619b2167b5c","Type":"ContainerDied","Data":"007d29b079b19764f9d7719f3e34df2f08cfcf6b91488bdcdc51c036c2ca9bc1"} Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.370422 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19facc80-e9df-42dc-8124-7619b2167b5c","Type":"ContainerDied","Data":"41572746e34d178fd02daf01a2fe3cb41b45a45e18e1670540f5d2845cab6370"} Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.380558 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" event={"ID":"f64da7f0-6afa-4c5f-913d-2127c033db81","Type":"ContainerStarted","Data":"3023274327d14f126d0bf4fef911c1bbde193d9855a9f09d8839bf17bda08958"} Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.381574 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-848f449699-2nhmn" event={"ID":"dd92adeb-535d-4d36-a176-b5cd3ca667dc","Type":"ContainerStarted","Data":"db5c67bb729acb46294125e7533fa279fe6f25eaf2758416cf9393aafb1dbc08"} Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.411044 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" event={"ID":"49b9137d-75ca-4b52-9338-6bf15270a667","Type":"ContainerStarted","Data":"153243dc8c822ca9cb000a84fac6e86d387dd16a33de18fd788583c08a81bc49"} Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.431145 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b6cdb54-tkxbl" event={"ID":"eba884ff-2e19-4dca-ba2e-75a8a311ea19","Type":"ContainerStarted","Data":"c90aa1d8fe143481592b8eb56fa976029dc29ed78cbd11bb8cad1d705a808502"} Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.431190 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b6cdb54-tkxbl" event={"ID":"eba884ff-2e19-4dca-ba2e-75a8a311ea19","Type":"ContainerStarted","Data":"53dc9317a004cbeec97c4e3cb299d55a3a972801aa307050ecbb56dafe0209d3"} Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.438530 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65667fcd94-ngsp8" event={"ID":"5fbd606b-59f6-4ba4-8e62-d90235b987d4","Type":"ContainerStarted","Data":"379431ab4949f910acbf9a00dc4eb17b6386de3783ab8f2d248a89617877e8bc"} Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.610063 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 
09:22:16.624407 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-8jdgt"] Feb 17 09:22:16 crc kubenswrapper[4848]: I0217 09:22:16.832608 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.460272 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65667fcd94-ngsp8" event={"ID":"5fbd606b-59f6-4ba4-8e62-d90235b987d4","Type":"ContainerStarted","Data":"c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3"} Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.460562 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65667fcd94-ngsp8" event={"ID":"5fbd606b-59f6-4ba4-8e62-d90235b987d4","Type":"ContainerStarted","Data":"2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e"} Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.461737 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.461793 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.471187 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d9a2cc0-eea8-401d-85db-7824e2ba0463","Type":"ContainerStarted","Data":"55a876e45cb588c818891f03bde27abd76bd4925fba7772b6fba7107bde76f17"} Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.475262 4848 generic.go:334] "Generic (PLEG): container finished" podID="f64da7f0-6afa-4c5f-913d-2127c033db81" containerID="e46dddfe058c9f83d61e184811904813d1c5ca82706101c2b95c6cdc97798712" exitCode=0 Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.475420 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" 
event={"ID":"f64da7f0-6afa-4c5f-913d-2127c033db81","Type":"ContainerDied","Data":"e46dddfe058c9f83d61e184811904813d1c5ca82706101c2b95c6cdc97798712"} Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.479556 4848 generic.go:334] "Generic (PLEG): container finished" podID="78406f6e-a154-4a75-96f6-99f77e092176" containerID="27c292dc0f347a6aec6815206c97764234ac510790a8a3434e2edda99e2a412d" exitCode=0 Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.479632 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" event={"ID":"78406f6e-a154-4a75-96f6-99f77e092176","Type":"ContainerDied","Data":"27c292dc0f347a6aec6815206c97764234ac510790a8a3434e2edda99e2a412d"} Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.479683 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" event={"ID":"78406f6e-a154-4a75-96f6-99f77e092176","Type":"ContainerStarted","Data":"2ff482fa86e0c155003bf91f25e9b15b3ebeb86e1516dce17e5bc64a41848835"} Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.481430 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-65667fcd94-ngsp8" podStartSLOduration=3.481420345 podStartE2EDuration="3.481420345s" podCreationTimestamp="2026-02-17 09:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:17.479670445 +0000 UTC m=+1015.022926091" watchObservedRunningTime="2026-02-17 09:22:17.481420345 +0000 UTC m=+1015.024675991" Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.520074 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7b6cdb54-tkxbl" event={"ID":"eba884ff-2e19-4dca-ba2e-75a8a311ea19","Type":"ContainerStarted","Data":"98c1edfc59fe271965b35adccea2bf624725b03dcb570cdc7d1eca6cc1ab09db"} Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.520927 4848 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.520955 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.526549 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dce8fa9b-bcca-459f-8483-60ed22b3e383","Type":"ContainerStarted","Data":"c91685c5d0783e69c71f354e2acf1f170c7efa357cae5be4708ea6e28a856706"} Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.588375 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7b6cdb54-tkxbl" podStartSLOduration=3.5883537309999998 podStartE2EDuration="3.588353731s" podCreationTimestamp="2026-02-17 09:22:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:17.572206512 +0000 UTC m=+1015.115462158" watchObservedRunningTime="2026-02-17 09:22:17.588353731 +0000 UTC m=+1015.131609377" Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.883132 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:17 crc kubenswrapper[4848]: I0217 09:22:17.902815 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.043325 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-swift-storage-0\") pod \"f64da7f0-6afa-4c5f-913d-2127c033db81\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.043385 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-config\") pod \"f64da7f0-6afa-4c5f-913d-2127c033db81\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.043462 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk259\" (UniqueName: \"kubernetes.io/projected/f64da7f0-6afa-4c5f-913d-2127c033db81-kube-api-access-wk259\") pod \"f64da7f0-6afa-4c5f-913d-2127c033db81\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.043544 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-sb\") pod \"f64da7f0-6afa-4c5f-913d-2127c033db81\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.043561 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-nb\") pod \"f64da7f0-6afa-4c5f-913d-2127c033db81\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " Feb 
17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.043579 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-svc\") pod \"f64da7f0-6afa-4c5f-913d-2127c033db81\" (UID: \"f64da7f0-6afa-4c5f-913d-2127c033db81\") " Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.048245 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64da7f0-6afa-4c5f-913d-2127c033db81-kube-api-access-wk259" (OuterVolumeSpecName: "kube-api-access-wk259") pod "f64da7f0-6afa-4c5f-913d-2127c033db81" (UID: "f64da7f0-6afa-4c5f-913d-2127c033db81"). InnerVolumeSpecName "kube-api-access-wk259". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.068740 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f64da7f0-6afa-4c5f-913d-2127c033db81" (UID: "f64da7f0-6afa-4c5f-913d-2127c033db81"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.073336 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f64da7f0-6afa-4c5f-913d-2127c033db81" (UID: "f64da7f0-6afa-4c5f-913d-2127c033db81"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.084373 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f64da7f0-6afa-4c5f-913d-2127c033db81" (UID: "f64da7f0-6afa-4c5f-913d-2127c033db81"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.093050 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f64da7f0-6afa-4c5f-913d-2127c033db81" (UID: "f64da7f0-6afa-4c5f-913d-2127c033db81"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.099313 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-config" (OuterVolumeSpecName: "config") pod "f64da7f0-6afa-4c5f-913d-2127c033db81" (UID: "f64da7f0-6afa-4c5f-913d-2127c033db81"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.145447 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.145472 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.145484 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.145494 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk259\" (UniqueName: \"kubernetes.io/projected/f64da7f0-6afa-4c5f-913d-2127c033db81-kube-api-access-wk259\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.145503 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.145512 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f64da7f0-6afa-4c5f-913d-2127c033db81-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.561264 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d9a2cc0-eea8-401d-85db-7824e2ba0463","Type":"ContainerStarted","Data":"4513965a35748a27bcf0df3456b3d3d26b39d5aa26477a8740dd1e3c7c57b911"} Feb 17 09:22:18 crc 
kubenswrapper[4848]: I0217 09:22:18.567142 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" event={"ID":"f64da7f0-6afa-4c5f-913d-2127c033db81","Type":"ContainerDied","Data":"3023274327d14f126d0bf4fef911c1bbde193d9855a9f09d8839bf17bda08958"} Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.567179 4848 scope.go:117] "RemoveContainer" containerID="e46dddfe058c9f83d61e184811904813d1c5ca82706101c2b95c6cdc97798712" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.567279 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b55f48d49-zm6nm" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.587616 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" event={"ID":"78406f6e-a154-4a75-96f6-99f77e092176","Type":"ContainerStarted","Data":"a74a6b83f8ac40260aa2adfaa182c9faa6bc65f4d668463b06789b35919903b1"} Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.588674 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.592693 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dce8fa9b-bcca-459f-8483-60ed22b3e383","Type":"ContainerStarted","Data":"9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee"} Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.646825 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-zm6nm"] Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.653364 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b55f48d49-zm6nm"] Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.685562 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" podStartSLOduration=3.685538767 
podStartE2EDuration="3.685538767s" podCreationTimestamp="2026-02-17 09:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:18.6591423 +0000 UTC m=+1016.202397946" watchObservedRunningTime="2026-02-17 09:22:18.685538767 +0000 UTC m=+1016.228794413" Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.772321 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:22:18 crc kubenswrapper[4848]: I0217 09:22:18.772368 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:22:18 crc kubenswrapper[4848]: E0217 09:22:18.799339 4848 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf64da7f0_6afa_4c5f_913d_2127c033db81.slice/crio-3023274327d14f126d0bf4fef911c1bbde193d9855a9f09d8839bf17bda08958\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf64da7f0_6afa_4c5f_913d_2127c033db81.slice\": RecentStats: unable to find data in memory cache]" Feb 17 09:22:19 crc kubenswrapper[4848]: I0217 09:22:19.417545 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64da7f0-6afa-4c5f-913d-2127c033db81" path="/var/lib/kubelet/pods/f64da7f0-6afa-4c5f-913d-2127c033db81/volumes" Feb 17 09:22:19 crc kubenswrapper[4848]: I0217 09:22:19.614022 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-848f449699-2nhmn" event={"ID":"dd92adeb-535d-4d36-a176-b5cd3ca667dc","Type":"ContainerStarted","Data":"65c1ab531d67952ff227d28732a7ec9365ca47a1a14b79a068267fbb3ef1f56e"} Feb 17 09:22:19 crc kubenswrapper[4848]: I0217 09:22:19.617206 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" event={"ID":"49b9137d-75ca-4b52-9338-6bf15270a667","Type":"ContainerStarted","Data":"7acda22703257d8e61890b3019ca55b0b38e9db21a7792af4b1d8f67e81908d9"} Feb 17 09:22:19 crc kubenswrapper[4848]: I0217 09:22:19.624995 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dce8fa9b-bcca-459f-8483-60ed22b3e383" containerName="cinder-api-log" containerID="cri-o://9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee" gracePeriod=30 Feb 17 09:22:19 crc kubenswrapper[4848]: I0217 09:22:19.626021 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dce8fa9b-bcca-459f-8483-60ed22b3e383","Type":"ContainerStarted","Data":"ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5"} Feb 17 09:22:19 crc kubenswrapper[4848]: I0217 09:22:19.627636 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Feb 17 09:22:19 crc kubenswrapper[4848]: I0217 09:22:19.628120 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="dce8fa9b-bcca-459f-8483-60ed22b3e383" containerName="cinder-api" containerID="cri-o://ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5" gracePeriod=30 Feb 17 09:22:19 crc kubenswrapper[4848]: I0217 09:22:19.655400 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.655383645 podStartE2EDuration="4.655383645s" podCreationTimestamp="2026-02-17 
09:22:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:19.646803076 +0000 UTC m=+1017.190058742" watchObservedRunningTime="2026-02-17 09:22:19.655383645 +0000 UTC m=+1017.198639281" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.259397 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.396233 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-combined-ca-bundle\") pod \"dce8fa9b-bcca-459f-8483-60ed22b3e383\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.396379 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-scripts\") pod \"dce8fa9b-bcca-459f-8483-60ed22b3e383\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.396498 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data\") pod \"dce8fa9b-bcca-459f-8483-60ed22b3e383\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.396563 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dce8fa9b-bcca-459f-8483-60ed22b3e383-logs\") pod \"dce8fa9b-bcca-459f-8483-60ed22b3e383\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.396587 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dce8fa9b-bcca-459f-8483-60ed22b3e383-etc-machine-id\") pod \"dce8fa9b-bcca-459f-8483-60ed22b3e383\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.396624 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnvhg\" (UniqueName: \"kubernetes.io/projected/dce8fa9b-bcca-459f-8483-60ed22b3e383-kube-api-access-tnvhg\") pod \"dce8fa9b-bcca-459f-8483-60ed22b3e383\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.396650 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data-custom\") pod \"dce8fa9b-bcca-459f-8483-60ed22b3e383\" (UID: \"dce8fa9b-bcca-459f-8483-60ed22b3e383\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.396775 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dce8fa9b-bcca-459f-8483-60ed22b3e383-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dce8fa9b-bcca-459f-8483-60ed22b3e383" (UID: "dce8fa9b-bcca-459f-8483-60ed22b3e383"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.396988 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dce8fa9b-bcca-459f-8483-60ed22b3e383-logs" (OuterVolumeSpecName: "logs") pod "dce8fa9b-bcca-459f-8483-60ed22b3e383" (UID: "dce8fa9b-bcca-459f-8483-60ed22b3e383"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.397301 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dce8fa9b-bcca-459f-8483-60ed22b3e383-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.397321 4848 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dce8fa9b-bcca-459f-8483-60ed22b3e383-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.402871 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dce8fa9b-bcca-459f-8483-60ed22b3e383-kube-api-access-tnvhg" (OuterVolumeSpecName: "kube-api-access-tnvhg") pod "dce8fa9b-bcca-459f-8483-60ed22b3e383" (UID: "dce8fa9b-bcca-459f-8483-60ed22b3e383"). InnerVolumeSpecName "kube-api-access-tnvhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.411947 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dce8fa9b-bcca-459f-8483-60ed22b3e383" (UID: "dce8fa9b-bcca-459f-8483-60ed22b3e383"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.412891 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-scripts" (OuterVolumeSpecName: "scripts") pod "dce8fa9b-bcca-459f-8483-60ed22b3e383" (UID: "dce8fa9b-bcca-459f-8483-60ed22b3e383"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.428306 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dce8fa9b-bcca-459f-8483-60ed22b3e383" (UID: "dce8fa9b-bcca-459f-8483-60ed22b3e383"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.449975 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data" (OuterVolumeSpecName: "config-data") pod "dce8fa9b-bcca-459f-8483-60ed22b3e383" (UID: "dce8fa9b-bcca-459f-8483-60ed22b3e383"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.498930 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.499217 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.499227 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.499239 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnvhg\" (UniqueName: \"kubernetes.io/projected/dce8fa9b-bcca-459f-8483-60ed22b3e383-kube-api-access-tnvhg\") on node \"crc\" DevicePath \"\"" Feb 17 
09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.499249 4848 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dce8fa9b-bcca-459f-8483-60ed22b3e383-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.633797 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d9a2cc0-eea8-401d-85db-7824e2ba0463","Type":"ContainerStarted","Data":"14214e79f728cd5082aa7a41736e11aa5222d422b77d8384aa975c0ab485daed"} Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.647096 4848 generic.go:334] "Generic (PLEG): container finished" podID="19facc80-e9df-42dc-8124-7619b2167b5c" containerID="d694ea51f368836cbddab6d27bab1acb6fcbace326a0b7a7bd157af6be65915f" exitCode=0 Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.647170 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19facc80-e9df-42dc-8124-7619b2167b5c","Type":"ContainerDied","Data":"d694ea51f368836cbddab6d27bab1acb6fcbace326a0b7a7bd157af6be65915f"} Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.652739 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-848f449699-2nhmn" event={"ID":"dd92adeb-535d-4d36-a176-b5cd3ca667dc","Type":"ContainerStarted","Data":"b6d19c19f753bf61b2710eab22003cb5b4a28f65e5a4ddbae182ff4be74ee57d"} Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.655645 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.738155889 podStartE2EDuration="5.655627566s" podCreationTimestamp="2026-02-17 09:22:15 +0000 UTC" firstStartedPulling="2026-02-17 09:22:16.64458309 +0000 UTC m=+1014.187838726" lastFinishedPulling="2026-02-17 09:22:17.562054757 +0000 UTC m=+1015.105310403" observedRunningTime="2026-02-17 09:22:20.652344501 +0000 UTC m=+1018.195600147" 
watchObservedRunningTime="2026-02-17 09:22:20.655627566 +0000 UTC m=+1018.198883212" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.664920 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" event={"ID":"49b9137d-75ca-4b52-9338-6bf15270a667","Type":"ContainerStarted","Data":"e589bf4d977a82596736a57db79520a68bec14b7d6c20c6e7ec4b719b2279ecc"} Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.688687 4848 generic.go:334] "Generic (PLEG): container finished" podID="dce8fa9b-bcca-459f-8483-60ed22b3e383" containerID="ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5" exitCode=0 Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.688719 4848 generic.go:334] "Generic (PLEG): container finished" podID="dce8fa9b-bcca-459f-8483-60ed22b3e383" containerID="9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee" exitCode=143 Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.689375 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.689503 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dce8fa9b-bcca-459f-8483-60ed22b3e383","Type":"ContainerDied","Data":"ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5"} Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.689529 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dce8fa9b-bcca-459f-8483-60ed22b3e383","Type":"ContainerDied","Data":"9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee"} Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.689539 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"dce8fa9b-bcca-459f-8483-60ed22b3e383","Type":"ContainerDied","Data":"c91685c5d0783e69c71f354e2acf1f170c7efa357cae5be4708ea6e28a856706"} Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.689553 4848 scope.go:117] "RemoveContainer" containerID="ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.715031 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-848f449699-2nhmn" podStartSLOduration=3.445295076 podStartE2EDuration="6.715014751s" podCreationTimestamp="2026-02-17 09:22:14 +0000 UTC" firstStartedPulling="2026-02-17 09:22:15.95666781 +0000 UTC m=+1013.499923456" lastFinishedPulling="2026-02-17 09:22:19.226387475 +0000 UTC m=+1016.769643131" observedRunningTime="2026-02-17 09:22:20.675879304 +0000 UTC m=+1018.219134950" watchObservedRunningTime="2026-02-17 09:22:20.715014751 +0000 UTC m=+1018.258270397" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.753748 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6c677d9df8-z5nnn" podStartSLOduration=3.3473109 
podStartE2EDuration="6.753720985s" podCreationTimestamp="2026-02-17 09:22:14 +0000 UTC" firstStartedPulling="2026-02-17 09:22:15.797854188 +0000 UTC m=+1013.341109834" lastFinishedPulling="2026-02-17 09:22:19.204264273 +0000 UTC m=+1016.747519919" observedRunningTime="2026-02-17 09:22:20.711698925 +0000 UTC m=+1018.254954571" watchObservedRunningTime="2026-02-17 09:22:20.753720985 +0000 UTC m=+1018.296976631" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.769302 4848 scope.go:117] "RemoveContainer" containerID="9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.786862 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.801134 4848 scope.go:117] "RemoveContainer" containerID="ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5" Feb 17 09:22:20 crc kubenswrapper[4848]: E0217 09:22:20.807628 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5\": container with ID starting with ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5 not found: ID does not exist" containerID="ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.807677 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5"} err="failed to get container status \"ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5\": rpc error: code = NotFound desc = could not find container \"ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5\": container with ID starting with ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5 not found: ID does not exist" Feb 17 09:22:20 crc 
kubenswrapper[4848]: I0217 09:22:20.807703 4848 scope.go:117] "RemoveContainer" containerID="9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee" Feb 17 09:22:20 crc kubenswrapper[4848]: E0217 09:22:20.809544 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee\": container with ID starting with 9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee not found: ID does not exist" containerID="9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.809588 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee"} err="failed to get container status \"9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee\": rpc error: code = NotFound desc = could not find container \"9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee\": container with ID starting with 9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee not found: ID does not exist" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.809615 4848 scope.go:117] "RemoveContainer" containerID="ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.809993 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5"} err="failed to get container status \"ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5\": rpc error: code = NotFound desc = could not find container \"ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5\": container with ID starting with ea41a36e7ca44ea317b4b608dccf5e4553bda0e03406a201fc474ac417a038b5 not found: ID does not exist" Feb 17 
09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.810015 4848 scope.go:117] "RemoveContainer" containerID="9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.810487 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee"} err="failed to get container status \"9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee\": rpc error: code = NotFound desc = could not find container \"9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee\": container with ID starting with 9824427f4a21d14f10f34201135a47f2fc63c7ee06243f1f060d49280fe4a0ee not found: ID does not exist" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.811839 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.822284 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.822717 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 17 09:22:20 crc kubenswrapper[4848]: E0217 09:22:20.823193 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="sg-core" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.823260 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="sg-core" Feb 17 09:22:20 crc kubenswrapper[4848]: E0217 09:22:20.823330 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dce8fa9b-bcca-459f-8483-60ed22b3e383" containerName="cinder-api-log" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.823382 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dce8fa9b-bcca-459f-8483-60ed22b3e383" containerName="cinder-api-log" Feb 17 09:22:20 crc kubenswrapper[4848]: E0217 09:22:20.823435 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="ceilometer-notification-agent" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.823483 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="ceilometer-notification-agent" Feb 17 09:22:20 crc kubenswrapper[4848]: E0217 09:22:20.823548 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="proxy-httpd" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.823611 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="proxy-httpd" Feb 17 09:22:20 crc kubenswrapper[4848]: E0217 09:22:20.823692 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64da7f0-6afa-4c5f-913d-2127c033db81" containerName="init" Feb 17 09:22:20 
crc kubenswrapper[4848]: I0217 09:22:20.823769 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64da7f0-6afa-4c5f-913d-2127c033db81" containerName="init" Feb 17 09:22:20 crc kubenswrapper[4848]: E0217 09:22:20.823835 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dce8fa9b-bcca-459f-8483-60ed22b3e383" containerName="cinder-api" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.823890 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dce8fa9b-bcca-459f-8483-60ed22b3e383" containerName="cinder-api" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.824112 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="proxy-httpd" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.824688 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64da7f0-6afa-4c5f-913d-2127c033db81" containerName="init" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.824749 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dce8fa9b-bcca-459f-8483-60ed22b3e383" containerName="cinder-api" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.824825 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dce8fa9b-bcca-459f-8483-60ed22b3e383" containerName="cinder-api-log" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.824905 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="sg-core" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.824962 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" containerName="ceilometer-notification-agent" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.826073 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.831243 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.831473 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.835224 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.838044 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.906001 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.907958 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-sg-core-conf-yaml\") pod \"19facc80-e9df-42dc-8124-7619b2167b5c\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.907991 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-scripts\") pod \"19facc80-e9df-42dc-8124-7619b2167b5c\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908079 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-config-data\") pod \"19facc80-e9df-42dc-8124-7619b2167b5c\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908144 4848 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-combined-ca-bundle\") pod \"19facc80-e9df-42dc-8124-7619b2167b5c\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908176 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45bz8\" (UniqueName: \"kubernetes.io/projected/19facc80-e9df-42dc-8124-7619b2167b5c-kube-api-access-45bz8\") pod \"19facc80-e9df-42dc-8124-7619b2167b5c\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908225 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-log-httpd\") pod \"19facc80-e9df-42dc-8124-7619b2167b5c\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908294 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-run-httpd\") pod \"19facc80-e9df-42dc-8124-7619b2167b5c\" (UID: \"19facc80-e9df-42dc-8124-7619b2167b5c\") " Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908484 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a637d69f-499a-4308-89a8-fad8fe4e6d59-logs\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908508 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-combined-ca-bundle\") pod 
\"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908539 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908558 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg42d\" (UniqueName: \"kubernetes.io/projected/a637d69f-499a-4308-89a8-fad8fe4e6d59-kube-api-access-qg42d\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908587 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-config-data-custom\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908604 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a637d69f-499a-4308-89a8-fad8fe4e6d59-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908621 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " 
pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908685 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-scripts\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.908703 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-config-data\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.909075 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "19facc80-e9df-42dc-8124-7619b2167b5c" (UID: "19facc80-e9df-42dc-8124-7619b2167b5c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.909087 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "19facc80-e9df-42dc-8124-7619b2167b5c" (UID: "19facc80-e9df-42dc-8124-7619b2167b5c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.914106 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19facc80-e9df-42dc-8124-7619b2167b5c-kube-api-access-45bz8" (OuterVolumeSpecName: "kube-api-access-45bz8") pod "19facc80-e9df-42dc-8124-7619b2167b5c" (UID: "19facc80-e9df-42dc-8124-7619b2167b5c"). 
InnerVolumeSpecName "kube-api-access-45bz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.920679 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-scripts" (OuterVolumeSpecName: "scripts") pod "19facc80-e9df-42dc-8124-7619b2167b5c" (UID: "19facc80-e9df-42dc-8124-7619b2167b5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.948248 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "19facc80-e9df-42dc-8124-7619b2167b5c" (UID: "19facc80-e9df-42dc-8124-7619b2167b5c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.973488 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19facc80-e9df-42dc-8124-7619b2167b5c" (UID: "19facc80-e9df-42dc-8124-7619b2167b5c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:20 crc kubenswrapper[4848]: I0217 09:22:20.987871 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-config-data" (OuterVolumeSpecName: "config-data") pod "19facc80-e9df-42dc-8124-7619b2167b5c" (UID: "19facc80-e9df-42dc-8124-7619b2167b5c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.011660 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.011958 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg42d\" (UniqueName: \"kubernetes.io/projected/a637d69f-499a-4308-89a8-fad8fe4e6d59-kube-api-access-qg42d\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.011993 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a637d69f-499a-4308-89a8-fad8fe4e6d59-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012007 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-config-data-custom\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012022 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012057 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a637d69f-499a-4308-89a8-fad8fe4e6d59-etc-machine-id\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012112 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-scripts\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012135 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-config-data\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012193 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a637d69f-499a-4308-89a8-fad8fe4e6d59-logs\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012211 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012267 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012279 4848 reconciler_common.go:293] "Volume detached for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012292 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45bz8\" (UniqueName: \"kubernetes.io/projected/19facc80-e9df-42dc-8124-7619b2167b5c-kube-api-access-45bz8\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012301 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012310 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/19facc80-e9df-42dc-8124-7619b2167b5c-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012318 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012328 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19facc80-e9df-42dc-8124-7619b2167b5c-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.012880 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a637d69f-499a-4308-89a8-fad8fe4e6d59-logs\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.017328 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-config-data-custom\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.017653 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-config-data\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.017734 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.018290 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.018718 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-scripts\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.018987 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a637d69f-499a-4308-89a8-fad8fe4e6d59-public-tls-certs\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.030288 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg42d\" (UniqueName: \"kubernetes.io/projected/a637d69f-499a-4308-89a8-fad8fe4e6d59-kube-api-access-qg42d\") pod \"cinder-api-0\" (UID: \"a637d69f-499a-4308-89a8-fad8fe4e6d59\") " pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.141678 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56c75f4b6d-vbll8"] Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.143242 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.145527 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.147818 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.149593 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.159546 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56c75f4b6d-vbll8"] Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.215440 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-internal-tls-certs\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.215653 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-config-data-custom\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.215752 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-logs\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.215854 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-combined-ca-bundle\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.215938 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lngx6\" (UniqueName: \"kubernetes.io/projected/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-kube-api-access-lngx6\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.216046 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-public-tls-certs\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.216123 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-config-data\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.317879 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lngx6\" (UniqueName: \"kubernetes.io/projected/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-kube-api-access-lngx6\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.317975 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-public-tls-certs\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.318022 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-config-data\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.318099 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-internal-tls-certs\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.318150 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-config-data-custom\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.318195 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-logs\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.318236 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-combined-ca-bundle\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.321205 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-logs\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.327129 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-public-tls-certs\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.327458 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-internal-tls-certs\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.327899 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-combined-ca-bundle\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.328353 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-config-data-custom\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.349847 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-config-data\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.350908 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lngx6\" (UniqueName: \"kubernetes.io/projected/63d2c1b3-b181-4afe-8cb1-3049a34c47d2-kube-api-access-lngx6\") pod \"barbican-api-56c75f4b6d-vbll8\" (UID: \"63d2c1b3-b181-4afe-8cb1-3049a34c47d2\") " pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.400938 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dce8fa9b-bcca-459f-8483-60ed22b3e383" path="/var/lib/kubelet/pods/dce8fa9b-bcca-459f-8483-60ed22b3e383/volumes" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.454798 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.531930 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.694459 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.704921 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"19facc80-e9df-42dc-8124-7619b2167b5c","Type":"ContainerDied","Data":"e2c00f7696e586e0bd4794022b2ae6dbd600c53ad91c67947bdaaf029dfc8b03"} Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.704967 4848 scope.go:117] "RemoveContainer" containerID="007d29b079b19764f9d7719f3e34df2f08cfcf6b91488bdcdc51c036c2ca9bc1" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.705366 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.767091 4848 scope.go:117] "RemoveContainer" containerID="41572746e34d178fd02daf01a2fe3cb41b45a45e18e1670540f5d2845cab6370" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.814127 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.849802 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.880610 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.882742 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.892870 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.895213 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.895614 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.914546 4848 scope.go:117] "RemoveContainer" containerID="d694ea51f368836cbddab6d27bab1acb6fcbace326a0b7a7bd157af6be65915f" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.928321 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5df76f45d5-hgxv9"] Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.928550 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5df76f45d5-hgxv9" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerName="neutron-api" containerID="cri-o://7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b" 
gracePeriod=30 Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.936009 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5df76f45d5-hgxv9" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerName="neutron-httpd" containerID="cri-o://32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649" gracePeriod=30 Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.946632 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-scripts\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.946707 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-config-data\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.946735 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdjpw\" (UniqueName: \"kubernetes.io/projected/65acfebd-febe-42db-8441-065b06724681-kube-api-access-qdjpw\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.946824 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-log-httpd\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.946889 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-run-httpd\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.946933 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.946962 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.966951 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5df76f45d5-hgxv9" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": EOF" Feb 17 09:22:21 crc kubenswrapper[4848]: I0217 09:22:21.986473 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-685b8f6845-8tvq5"] Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.003241 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.029920 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-685b8f6845-8tvq5"] Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.053092 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-scripts\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.053145 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-config-data\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.053164 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdjpw\" (UniqueName: \"kubernetes.io/projected/65acfebd-febe-42db-8441-065b06724681-kube-api-access-qdjpw\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.053194 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-log-httpd\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.053239 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-run-httpd\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc 
kubenswrapper[4848]: I0217 09:22:22.053274 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.053298 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.062375 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-run-httpd\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.062691 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-log-httpd\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.066643 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-scripts\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.070434 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.079409 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.091805 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-config-data\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.115964 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdjpw\" (UniqueName: \"kubernetes.io/projected/65acfebd-febe-42db-8441-065b06724681-kube-api-access-qdjpw\") pod \"ceilometer-0\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: W0217 09:22:22.147193 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63d2c1b3_b181_4afe_8cb1_3049a34c47d2.slice/crio-7f091369b5b1cb40b6d32cf3c4bebda306f3cf7b46e72ad8a0d73285b88f6a25 WatchSource:0}: Error finding container 7f091369b5b1cb40b6d32cf3c4bebda306f3cf7b46e72ad8a0d73285b88f6a25: Status 404 returned error can't find the container with id 7f091369b5b1cb40b6d32cf3c4bebda306f3cf7b46e72ad8a0d73285b88f6a25 Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.150565 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56c75f4b6d-vbll8"] Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.154814 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-public-tls-certs\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.154854 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-config\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.154917 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-combined-ca-bundle\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.154950 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-ovndb-tls-certs\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.155006 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzbxn\" (UniqueName: \"kubernetes.io/projected/e060d08b-cb90-4fe6-badb-ae482aeb505d-kube-api-access-xzbxn\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.155037 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-internal-tls-certs\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.155092 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-httpd-config\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.257030 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-httpd-config\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.257379 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-public-tls-certs\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.257401 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-config\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.257442 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-combined-ca-bundle\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.257471 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-ovndb-tls-certs\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.257521 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzbxn\" (UniqueName: \"kubernetes.io/projected/e060d08b-cb90-4fe6-badb-ae482aeb505d-kube-api-access-xzbxn\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.257543 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-internal-tls-certs\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.263559 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-internal-tls-certs\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.264128 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-combined-ca-bundle\") 
pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.265443 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-ovndb-tls-certs\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.269317 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-httpd-config\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.273665 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-public-tls-certs\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.273960 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e060d08b-cb90-4fe6-badb-ae482aeb505d-config\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.279869 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzbxn\" (UniqueName: \"kubernetes.io/projected/e060d08b-cb90-4fe6-badb-ae482aeb505d-kube-api-access-xzbxn\") pod \"neutron-685b8f6845-8tvq5\" (UID: \"e060d08b-cb90-4fe6-badb-ae482aeb505d\") " pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc 
kubenswrapper[4848]: I0217 09:22:22.285338 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.391457 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.733345 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56c75f4b6d-vbll8" event={"ID":"63d2c1b3-b181-4afe-8cb1-3049a34c47d2","Type":"ContainerStarted","Data":"ff36d74dfbd246cdc721cc8f897b3c44865755666f82dde782958176fdc6de23"} Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.733603 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56c75f4b6d-vbll8" event={"ID":"63d2c1b3-b181-4afe-8cb1-3049a34c47d2","Type":"ContainerStarted","Data":"7f091369b5b1cb40b6d32cf3c4bebda306f3cf7b46e72ad8a0d73285b88f6a25"} Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.751495 4848 generic.go:334] "Generic (PLEG): container finished" podID="f8226a25-977f-4934-b40b-7504ab8f23e4" containerID="276455a72da624622d24fe647d83eb57d4649914bdffc56aef69370957d54dad" exitCode=137 Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.751527 4848 generic.go:334] "Generic (PLEG): container finished" podID="f8226a25-977f-4934-b40b-7504ab8f23e4" containerID="5f3c2dcbbc6ea1b986ce34c527d927661d7665ba59bc45f1b819435a7303a7c4" exitCode=137 Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.751572 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-775dd6779c-j9k6n" event={"ID":"f8226a25-977f-4934-b40b-7504ab8f23e4","Type":"ContainerDied","Data":"276455a72da624622d24fe647d83eb57d4649914bdffc56aef69370957d54dad"} Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.751601 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-775dd6779c-j9k6n" 
event={"ID":"f8226a25-977f-4934-b40b-7504ab8f23e4","Type":"ContainerDied","Data":"5f3c2dcbbc6ea1b986ce34c527d927661d7665ba59bc45f1b819435a7303a7c4"} Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.755570 4848 generic.go:334] "Generic (PLEG): container finished" podID="d790c2ab-67aa-4c46-9407-9fa991223dd0" containerID="6993accf01497b8287cf6562a92fd53c6e9d1c384ab683be59ec133035931f6f" exitCode=137 Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.755603 4848 generic.go:334] "Generic (PLEG): container finished" podID="d790c2ab-67aa-4c46-9407-9fa991223dd0" containerID="0af8c9459567215b1bb5f1aca4bdbe716fa1f64aedf6b3a4c8c6ce8750551728" exitCode=137 Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.755668 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bbdfc68f-j9b8f" event={"ID":"d790c2ab-67aa-4c46-9407-9fa991223dd0","Type":"ContainerDied","Data":"6993accf01497b8287cf6562a92fd53c6e9d1c384ab683be59ec133035931f6f"} Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.755693 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bbdfc68f-j9b8f" event={"ID":"d790c2ab-67aa-4c46-9407-9fa991223dd0","Type":"ContainerDied","Data":"0af8c9459567215b1bb5f1aca4bdbe716fa1f64aedf6b3a4c8c6ce8750551728"} Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.757385 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a637d69f-499a-4308-89a8-fad8fe4e6d59","Type":"ContainerStarted","Data":"40b62d458d583120ae85464afff0ece9f0cc4505515f3b98a97e890aa34d93a3"} Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.774700 4848 generic.go:334] "Generic (PLEG): container finished" podID="adcdc585-f670-4bc3-be54-acf796d438df" containerID="558fcd89e85a1796d184735aafca39c6f5a56a859a594d7ec3af0dd0a0436a8b" exitCode=137 Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.774735 4848 generic.go:334] "Generic (PLEG): container finished" 
podID="adcdc585-f670-4bc3-be54-acf796d438df" containerID="129fe4ebbb776d354652314ab286f6852b2eda985baaa608ebb67534453727d9" exitCode=137 Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.775627 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c79cc9c-crdch" event={"ID":"adcdc585-f670-4bc3-be54-acf796d438df","Type":"ContainerDied","Data":"558fcd89e85a1796d184735aafca39c6f5a56a859a594d7ec3af0dd0a0436a8b"} Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.775663 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c79cc9c-crdch" event={"ID":"adcdc585-f670-4bc3-be54-acf796d438df","Type":"ContainerDied","Data":"129fe4ebbb776d354652314ab286f6852b2eda985baaa608ebb67534453727d9"} Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.875639 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-775dd6779c-j9k6n" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.974275 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-scripts\") pod \"f8226a25-977f-4934-b40b-7504ab8f23e4\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.974604 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-config-data\") pod \"f8226a25-977f-4934-b40b-7504ab8f23e4\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.974722 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8226a25-977f-4934-b40b-7504ab8f23e4-logs\") pod \"f8226a25-977f-4934-b40b-7504ab8f23e4\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " Feb 17 09:22:22 crc 
kubenswrapper[4848]: I0217 09:22:22.974747 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8226a25-977f-4934-b40b-7504ab8f23e4-horizon-secret-key\") pod \"f8226a25-977f-4934-b40b-7504ab8f23e4\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.974808 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fk6fp\" (UniqueName: \"kubernetes.io/projected/f8226a25-977f-4934-b40b-7504ab8f23e4-kube-api-access-fk6fp\") pod \"f8226a25-977f-4934-b40b-7504ab8f23e4\" (UID: \"f8226a25-977f-4934-b40b-7504ab8f23e4\") " Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.977320 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8226a25-977f-4934-b40b-7504ab8f23e4-logs" (OuterVolumeSpecName: "logs") pod "f8226a25-977f-4934-b40b-7504ab8f23e4" (UID: "f8226a25-977f-4934-b40b-7504ab8f23e4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:22 crc kubenswrapper[4848]: I0217 09:22:22.989336 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8226a25-977f-4934-b40b-7504ab8f23e4-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "f8226a25-977f-4934-b40b-7504ab8f23e4" (UID: "f8226a25-977f-4934-b40b-7504ab8f23e4"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:22.997932 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8226a25-977f-4934-b40b-7504ab8f23e4-kube-api-access-fk6fp" (OuterVolumeSpecName: "kube-api-access-fk6fp") pod "f8226a25-977f-4934-b40b-7504ab8f23e4" (UID: "f8226a25-977f-4934-b40b-7504ab8f23e4"). InnerVolumeSpecName "kube-api-access-fk6fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.013334 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-scripts" (OuterVolumeSpecName: "scripts") pod "f8226a25-977f-4934-b40b-7504ab8f23e4" (UID: "f8226a25-977f-4934-b40b-7504ab8f23e4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.028988 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-config-data" (OuterVolumeSpecName: "config-data") pod "f8226a25-977f-4934-b40b-7504ab8f23e4" (UID: "f8226a25-977f-4934-b40b-7504ab8f23e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.083123 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8226a25-977f-4934-b40b-7504ab8f23e4-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.083145 4848 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/f8226a25-977f-4934-b40b-7504ab8f23e4-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.083155 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fk6fp\" (UniqueName: \"kubernetes.io/projected/f8226a25-977f-4934-b40b-7504ab8f23e4-kube-api-access-fk6fp\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.083164 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc 
kubenswrapper[4848]: I0217 09:22:23.083173 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8226a25-977f-4934-b40b-7504ab8f23e4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.093035 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844c79cc9c-crdch" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.121481 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57bbdfc68f-j9b8f" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.184338 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5hc76\" (UniqueName: \"kubernetes.io/projected/adcdc585-f670-4bc3-be54-acf796d438df-kube-api-access-5hc76\") pod \"adcdc585-f670-4bc3-be54-acf796d438df\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.184404 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-config-data\") pod \"d790c2ab-67aa-4c46-9407-9fa991223dd0\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.184439 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d790c2ab-67aa-4c46-9407-9fa991223dd0-logs\") pod \"d790c2ab-67aa-4c46-9407-9fa991223dd0\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.184514 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-scripts\") pod \"adcdc585-f670-4bc3-be54-acf796d438df\" (UID: 
\"adcdc585-f670-4bc3-be54-acf796d438df\") " Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.184546 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adcdc585-f670-4bc3-be54-acf796d438df-logs\") pod \"adcdc585-f670-4bc3-be54-acf796d438df\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.184588 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-scripts\") pod \"d790c2ab-67aa-4c46-9407-9fa991223dd0\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.184613 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdj88\" (UniqueName: \"kubernetes.io/projected/d790c2ab-67aa-4c46-9407-9fa991223dd0-kube-api-access-mdj88\") pod \"d790c2ab-67aa-4c46-9407-9fa991223dd0\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.184659 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d790c2ab-67aa-4c46-9407-9fa991223dd0-horizon-secret-key\") pod \"d790c2ab-67aa-4c46-9407-9fa991223dd0\" (UID: \"d790c2ab-67aa-4c46-9407-9fa991223dd0\") " Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.184678 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adcdc585-f670-4bc3-be54-acf796d438df-horizon-secret-key\") pod \"adcdc585-f670-4bc3-be54-acf796d438df\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.184736 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-config-data\") pod \"adcdc585-f670-4bc3-be54-acf796d438df\" (UID: \"adcdc585-f670-4bc3-be54-acf796d438df\") " Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.185746 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/adcdc585-f670-4bc3-be54-acf796d438df-logs" (OuterVolumeSpecName: "logs") pod "adcdc585-f670-4bc3-be54-acf796d438df" (UID: "adcdc585-f670-4bc3-be54-acf796d438df"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.186024 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d790c2ab-67aa-4c46-9407-9fa991223dd0-logs" (OuterVolumeSpecName: "logs") pod "d790c2ab-67aa-4c46-9407-9fa991223dd0" (UID: "d790c2ab-67aa-4c46-9407-9fa991223dd0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.218563 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d790c2ab-67aa-4c46-9407-9fa991223dd0-kube-api-access-mdj88" (OuterVolumeSpecName: "kube-api-access-mdj88") pod "d790c2ab-67aa-4c46-9407-9fa991223dd0" (UID: "d790c2ab-67aa-4c46-9407-9fa991223dd0"). InnerVolumeSpecName "kube-api-access-mdj88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.222914 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d790c2ab-67aa-4c46-9407-9fa991223dd0-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "d790c2ab-67aa-4c46-9407-9fa991223dd0" (UID: "d790c2ab-67aa-4c46-9407-9fa991223dd0"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.222919 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adcdc585-f670-4bc3-be54-acf796d438df-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "adcdc585-f670-4bc3-be54-acf796d438df" (UID: "adcdc585-f670-4bc3-be54-acf796d438df"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.223005 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adcdc585-f670-4bc3-be54-acf796d438df-kube-api-access-5hc76" (OuterVolumeSpecName: "kube-api-access-5hc76") pod "adcdc585-f670-4bc3-be54-acf796d438df" (UID: "adcdc585-f670-4bc3-be54-acf796d438df"). InnerVolumeSpecName "kube-api-access-5hc76". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.236370 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-config-data" (OuterVolumeSpecName: "config-data") pod "d790c2ab-67aa-4c46-9407-9fa991223dd0" (UID: "d790c2ab-67aa-4c46-9407-9fa991223dd0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.246396 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-config-data" (OuterVolumeSpecName: "config-data") pod "adcdc585-f670-4bc3-be54-acf796d438df" (UID: "adcdc585-f670-4bc3-be54-acf796d438df"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.253205 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-scripts" (OuterVolumeSpecName: "scripts") pod "d790c2ab-67aa-4c46-9407-9fa991223dd0" (UID: "d790c2ab-67aa-4c46-9407-9fa991223dd0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.265372 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.277246 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-scripts" (OuterVolumeSpecName: "scripts") pod "adcdc585-f670-4bc3-be54-acf796d438df" (UID: "adcdc585-f670-4bc3-be54-acf796d438df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.288273 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5hc76\" (UniqueName: \"kubernetes.io/projected/adcdc585-f670-4bc3-be54-acf796d438df-kube-api-access-5hc76\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.288307 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.288316 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d790c2ab-67aa-4c46-9407-9fa991223dd0-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.288324 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.288332 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/adcdc585-f670-4bc3-be54-acf796d438df-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.288340 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d790c2ab-67aa-4c46-9407-9fa991223dd0-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.288348 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdj88\" (UniqueName: \"kubernetes.io/projected/d790c2ab-67aa-4c46-9407-9fa991223dd0-kube-api-access-mdj88\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.288356 4848 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d790c2ab-67aa-4c46-9407-9fa991223dd0-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.288364 4848 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/adcdc585-f670-4bc3-be54-acf796d438df-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.288372 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/adcdc585-f670-4bc3-be54-acf796d438df-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:23 crc kubenswrapper[4848]: W0217 09:22:23.476890 4848 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode060d08b_cb90_4fe6_badb_ae482aeb505d.slice/crio-87fb4283fb3857f3522531111ce3ab04ec1c2d240e06e83d8847e17f6ae8a785 WatchSource:0}: Error finding container 87fb4283fb3857f3522531111ce3ab04ec1c2d240e06e83d8847e17f6ae8a785: Status 404 returned error can't find the container with id 87fb4283fb3857f3522531111ce3ab04ec1c2d240e06e83d8847e17f6ae8a785
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.478434 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-5df76f45d5-hgxv9" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.155:9696/\": dial tcp 10.217.0.155:9696: connect: connection refused"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.481549 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19facc80-e9df-42dc-8124-7619b2167b5c" path="/var/lib/kubelet/pods/19facc80-e9df-42dc-8124-7619b2167b5c/volumes"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.490115 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-685b8f6845-8tvq5"]
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.790532 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df76f45d5-hgxv9" event={"ID":"e37f2d41-bace-4ca3-a811-36b44ee278d4","Type":"ContainerDied","Data":"32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649"}
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.790675 4848 generic.go:334] "Generic (PLEG): container finished" podID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerID="32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649" exitCode=0
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.795785 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a637d69f-499a-4308-89a8-fad8fe4e6d59","Type":"ContainerStarted","Data":"bcce9425f6f4341af63b7e5725d98c68e2a108458839bca5893b0a65638fb44f"}
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.795885 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"a637d69f-499a-4308-89a8-fad8fe4e6d59","Type":"ContainerStarted","Data":"4240ba200369704d57982b40db3c6b253bf66757e1eab11d63c3ec6579ea61ce"}
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.796191 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.798118 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844c79cc9c-crdch" event={"ID":"adcdc585-f670-4bc3-be54-acf796d438df","Type":"ContainerDied","Data":"ea40469db3244034845afbf0ff0731b52963ca757a73f28346fe2fe701edaaa5"}
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.798161 4848 scope.go:117] "RemoveContainer" containerID="558fcd89e85a1796d184735aafca39c6f5a56a859a594d7ec3af0dd0a0436a8b"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.798290 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844c79cc9c-crdch"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.801893 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-685b8f6845-8tvq5" event={"ID":"e060d08b-cb90-4fe6-badb-ae482aeb505d","Type":"ContainerStarted","Data":"a865270869c89d85c35324268a3796957346046d2967ac13a45db3d217a24648"}
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.801926 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-685b8f6845-8tvq5" event={"ID":"e060d08b-cb90-4fe6-badb-ae482aeb505d","Type":"ContainerStarted","Data":"87fb4283fb3857f3522531111ce3ab04ec1c2d240e06e83d8847e17f6ae8a785"}
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.804106 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65acfebd-febe-42db-8441-065b06724681","Type":"ContainerStarted","Data":"1d4d95c874b8c554543aa4564d519795eb8b53eac2ff74e3257f1ee86205096b"}
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.806519 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56c75f4b6d-vbll8" event={"ID":"63d2c1b3-b181-4afe-8cb1-3049a34c47d2","Type":"ContainerStarted","Data":"65cddc02df6f55dc35a3e2fe10906d19426fed75ec7a962c025115cc4123a9d6"}
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.806603 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56c75f4b6d-vbll8"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.806698 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56c75f4b6d-vbll8"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.811069 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-775dd6779c-j9k6n"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.811557 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-775dd6779c-j9k6n" event={"ID":"f8226a25-977f-4934-b40b-7504ab8f23e4","Type":"ContainerDied","Data":"2d302a0145acf1b92e15433961d8672c9ac65c76bbb2eb46adcb8070e872fbcb"}
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.824689 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-57bbdfc68f-j9b8f" event={"ID":"d790c2ab-67aa-4c46-9407-9fa991223dd0","Type":"ContainerDied","Data":"834cca69968e35eaae5f0294086e2e803f3a83063138d1a657c7a2dabcdbb3f1"}
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.824844 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-57bbdfc68f-j9b8f"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.835630 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.835609555 podStartE2EDuration="3.835609555s" podCreationTimestamp="2026-02-17 09:22:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:23.821367421 +0000 UTC m=+1021.364623077" watchObservedRunningTime="2026-02-17 09:22:23.835609555 +0000 UTC m=+1021.378865201"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.858497 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56c75f4b6d-vbll8" podStartSLOduration=2.858480169 podStartE2EDuration="2.858480169s" podCreationTimestamp="2026-02-17 09:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:23.857232803 +0000 UTC m=+1021.400488469" watchObservedRunningTime="2026-02-17 09:22:23.858480169 +0000 UTC m=+1021.401735815"
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.887830 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-844c79cc9c-crdch"]
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.926597 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-844c79cc9c-crdch"]
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.937804 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-775dd6779c-j9k6n"]
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.946633 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-775dd6779c-j9k6n"]
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.959950 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-57bbdfc68f-j9b8f"]
Feb 17 09:22:23 crc kubenswrapper[4848]: I0217 09:22:23.972067 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-57bbdfc68f-j9b8f"]
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.057304 4848 scope.go:117] "RemoveContainer" containerID="129fe4ebbb776d354652314ab286f6852b2eda985baaa608ebb67534453727d9"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.081039 4848 scope.go:117] "RemoveContainer" containerID="276455a72da624622d24fe647d83eb57d4649914bdffc56aef69370957d54dad"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.271927 4848 scope.go:117] "RemoveContainer" containerID="5f3c2dcbbc6ea1b986ce34c527d927661d7665ba59bc45f1b819435a7303a7c4"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.360819 4848 scope.go:117] "RemoveContainer" containerID="6993accf01497b8287cf6562a92fd53c6e9d1c384ab683be59ec133035931f6f"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.505282 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-749cc47784-q9crv"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.543186 4848 scope.go:117] "RemoveContainer" containerID="0af8c9459567215b1bb5f1aca4bdbe716fa1f64aedf6b3a4c8c6ce8750551728"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.711261 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-676bdd79dd-lq228"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.845673 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5df76f45d5-hgxv9"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.846175 4848 generic.go:334] "Generic (PLEG): container finished" podID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerID="7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b" exitCode=0
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.846243 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df76f45d5-hgxv9" event={"ID":"e37f2d41-bace-4ca3-a811-36b44ee278d4","Type":"ContainerDied","Data":"7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b"}
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.846271 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5df76f45d5-hgxv9" event={"ID":"e37f2d41-bace-4ca3-a811-36b44ee278d4","Type":"ContainerDied","Data":"e0cfcb15b1d4aaa58b2591c7e99cd1ba80de439c93dc40cba571166b780b4d51"}
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.846297 4848 scope.go:117] "RemoveContainer" containerID="32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.862039 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-685b8f6845-8tvq5" event={"ID":"e060d08b-cb90-4fe6-badb-ae482aeb505d","Type":"ContainerStarted","Data":"2b246946faaaa1bfbacc373ef4c37e94db767974f95bb0f518854da4dca0501f"}
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.862139 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-685b8f6845-8tvq5"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.909963 4848 scope.go:117] "RemoveContainer" containerID="7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.941644 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-685b8f6845-8tvq5" podStartSLOduration=3.941627998 podStartE2EDuration="3.941627998s" podCreationTimestamp="2026-02-17 09:22:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:24.908522917 +0000 UTC m=+1022.451778573" watchObservedRunningTime="2026-02-17 09:22:24.941627998 +0000 UTC m=+1022.484883644"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.945009 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-795rc\" (UniqueName: \"kubernetes.io/projected/e37f2d41-bace-4ca3-a811-36b44ee278d4-kube-api-access-795rc\") pod \"e37f2d41-bace-4ca3-a811-36b44ee278d4\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") "
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.945070 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-combined-ca-bundle\") pod \"e37f2d41-bace-4ca3-a811-36b44ee278d4\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") "
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.945171 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-ovndb-tls-certs\") pod \"e37f2d41-bace-4ca3-a811-36b44ee278d4\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") "
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.945226 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-httpd-config\") pod \"e37f2d41-bace-4ca3-a811-36b44ee278d4\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") "
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.945310 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-config\") pod \"e37f2d41-bace-4ca3-a811-36b44ee278d4\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") "
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.945342 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-public-tls-certs\") pod \"e37f2d41-bace-4ca3-a811-36b44ee278d4\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") "
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.945359 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-internal-tls-certs\") pod \"e37f2d41-bace-4ca3-a811-36b44ee278d4\" (UID: \"e37f2d41-bace-4ca3-a811-36b44ee278d4\") "
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.950885 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e37f2d41-bace-4ca3-a811-36b44ee278d4-kube-api-access-795rc" (OuterVolumeSpecName: "kube-api-access-795rc") pod "e37f2d41-bace-4ca3-a811-36b44ee278d4" (UID: "e37f2d41-bace-4ca3-a811-36b44ee278d4"). InnerVolumeSpecName "kube-api-access-795rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.960987 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "e37f2d41-bace-4ca3-a811-36b44ee278d4" (UID: "e37f2d41-bace-4ca3-a811-36b44ee278d4"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.969155 4848 scope.go:117] "RemoveContainer" containerID="32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649"
Feb 17 09:22:24 crc kubenswrapper[4848]: E0217 09:22:24.971061 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649\": container with ID starting with 32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649 not found: ID does not exist" containerID="32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.971095 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649"} err="failed to get container status \"32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649\": rpc error: code = NotFound desc = could not find container \"32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649\": container with ID starting with 32eabd0988df57cdd830fc15c053b758ac4faa7582ea065e28d9c0b5f4c53649 not found: ID does not exist"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.971116 4848 scope.go:117] "RemoveContainer" containerID="7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b"
Feb 17 09:22:24 crc kubenswrapper[4848]: E0217 09:22:24.972534 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b\": container with ID starting with 7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b not found: ID does not exist" containerID="7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b"
Feb 17 09:22:24 crc kubenswrapper[4848]: I0217 09:22:24.972555 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b"} err="failed to get container status \"7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b\": rpc error: code = NotFound desc = could not find container \"7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b\": container with ID starting with 7b28b1b7d740ab64c25cb38758987da296dc7a5cbf9b99c3e43634575486d07b not found: ID does not exist"
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.020622 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e37f2d41-bace-4ca3-a811-36b44ee278d4" (UID: "e37f2d41-bace-4ca3-a811-36b44ee278d4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.028701 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e37f2d41-bace-4ca3-a811-36b44ee278d4" (UID: "e37f2d41-bace-4ca3-a811-36b44ee278d4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.030424 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-config" (OuterVolumeSpecName: "config") pod "e37f2d41-bace-4ca3-a811-36b44ee278d4" (UID: "e37f2d41-bace-4ca3-a811-36b44ee278d4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.047917 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.047949 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-config\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.047960 4848 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.047970 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-795rc\" (UniqueName: \"kubernetes.io/projected/e37f2d41-bace-4ca3-a811-36b44ee278d4-kube-api-access-795rc\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.047980 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.048043 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e37f2d41-bace-4ca3-a811-36b44ee278d4" (UID: "e37f2d41-bace-4ca3-a811-36b44ee278d4"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.068988 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "e37f2d41-bace-4ca3-a811-36b44ee278d4" (UID: "e37f2d41-bace-4ca3-a811-36b44ee278d4"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.149608 4848 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.149647 4848 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e37f2d41-bace-4ca3-a811-36b44ee278d4-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.410972 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adcdc585-f670-4bc3-be54-acf796d438df" path="/var/lib/kubelet/pods/adcdc585-f670-4bc3-be54-acf796d438df/volumes"
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.411791 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d790c2ab-67aa-4c46-9407-9fa991223dd0" path="/var/lib/kubelet/pods/d790c2ab-67aa-4c46-9407-9fa991223dd0/volumes"
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.413228 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8226a25-977f-4934-b40b-7504ab8f23e4" path="/var/lib/kubelet/pods/f8226a25-977f-4934-b40b-7504ab8f23e4/volumes"
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.890989 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65acfebd-febe-42db-8441-065b06724681","Type":"ContainerStarted","Data":"33cb62653bf607b598e80d536c84915fc8ebed23b57517a1fe4ae00d0262d473"}
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.891250 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65acfebd-febe-42db-8441-065b06724681","Type":"ContainerStarted","Data":"e7152b88ec933da1a2c6d7bc7a639190e9aeb9bc89fdcce26d10c9b357f6a5e8"}
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.892032 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5df76f45d5-hgxv9"
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.926636 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5df76f45d5-hgxv9"]
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.939707 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5df76f45d5-hgxv9"]
Feb 17 09:22:25 crc kubenswrapper[4848]: I0217 09:22:25.955187 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.030196 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-2s8ss"]
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.030475 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" podUID="72029462-f77a-48a6-8fcc-49d2e9ab7046" containerName="dnsmasq-dns" containerID="cri-o://ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba" gracePeriod=10
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.219390 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.294521 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.329946 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" podUID="72029462-f77a-48a6-8fcc-49d2e9ab7046" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: connect: connection refused"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.665506 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.688389 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-svc\") pod \"72029462-f77a-48a6-8fcc-49d2e9ab7046\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") "
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.688478 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-config\") pod \"72029462-f77a-48a6-8fcc-49d2e9ab7046\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") "
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.688553 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-nb\") pod \"72029462-f77a-48a6-8fcc-49d2e9ab7046\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") "
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.688576 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-swift-storage-0\") pod \"72029462-f77a-48a6-8fcc-49d2e9ab7046\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") "
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.688619 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-sb\") pod \"72029462-f77a-48a6-8fcc-49d2e9ab7046\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") "
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.688653 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvmlc\" (UniqueName: \"kubernetes.io/projected/72029462-f77a-48a6-8fcc-49d2e9ab7046-kube-api-access-gvmlc\") pod \"72029462-f77a-48a6-8fcc-49d2e9ab7046\" (UID: \"72029462-f77a-48a6-8fcc-49d2e9ab7046\") "
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.702906 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72029462-f77a-48a6-8fcc-49d2e9ab7046-kube-api-access-gvmlc" (OuterVolumeSpecName: "kube-api-access-gvmlc") pod "72029462-f77a-48a6-8fcc-49d2e9ab7046" (UID: "72029462-f77a-48a6-8fcc-49d2e9ab7046"). InnerVolumeSpecName "kube-api-access-gvmlc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.765553 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-config" (OuterVolumeSpecName: "config") pod "72029462-f77a-48a6-8fcc-49d2e9ab7046" (UID: "72029462-f77a-48a6-8fcc-49d2e9ab7046"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.772265 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "72029462-f77a-48a6-8fcc-49d2e9ab7046" (UID: "72029462-f77a-48a6-8fcc-49d2e9ab7046"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.783878 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "72029462-f77a-48a6-8fcc-49d2e9ab7046" (UID: "72029462-f77a-48a6-8fcc-49d2e9ab7046"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.784521 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "72029462-f77a-48a6-8fcc-49d2e9ab7046" (UID: "72029462-f77a-48a6-8fcc-49d2e9ab7046"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.790900 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-config\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.791143 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.791248 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.791328 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.791392 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvmlc\" (UniqueName: \"kubernetes.io/projected/72029462-f77a-48a6-8fcc-49d2e9ab7046-kube-api-access-gvmlc\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.818393 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "72029462-f77a-48a6-8fcc-49d2e9ab7046" (UID: "72029462-f77a-48a6-8fcc-49d2e9ab7046"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.893533 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/72029462-f77a-48a6-8fcc-49d2e9ab7046-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.900680 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65acfebd-febe-42db-8441-065b06724681","Type":"ContainerStarted","Data":"fe2fb4796981bf23222fd2c45c8056b0de9d974652b09f75edd5fe50e3b32922"}
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.902136 4848 generic.go:334] "Generic (PLEG): container finished" podID="72029462-f77a-48a6-8fcc-49d2e9ab7046" containerID="ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba" exitCode=0
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.902180 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" event={"ID":"72029462-f77a-48a6-8fcc-49d2e9ab7046","Type":"ContainerDied","Data":"ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba"}
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.902295 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss" event={"ID":"72029462-f77a-48a6-8fcc-49d2e9ab7046","Type":"ContainerDied","Data":"c2215ee1bd9cfd1830fabe03d586071621d75d7b1e56f81bacdcab8aa694a3aa"}
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.902319 4848 scope.go:117] "RemoveContainer" containerID="ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.902207 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f455b5fc7-2s8ss"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.902839 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" containerName="cinder-scheduler" containerID="cri-o://4513965a35748a27bcf0df3456b3d3d26b39d5aa26477a8740dd1e3c7c57b911" gracePeriod=30
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.902857 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" containerName="probe" containerID="cri-o://14214e79f728cd5082aa7a41736e11aa5222d422b77d8384aa975c0ab485daed" gracePeriod=30
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.912804 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-749cc47784-q9crv"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.920299 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-676bdd79dd-lq228"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.925119 4848 scope.go:117] "RemoveContainer" containerID="a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.967573 4848 scope.go:117] "RemoveContainer" containerID="ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba"
Feb 17 09:22:26 crc kubenswrapper[4848]: E0217 09:22:26.970945 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba\": container with ID starting with ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba not found: ID does not exist" containerID="ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.970979 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba"} err="failed to get container status \"ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba\": rpc error: code = NotFound desc = could not find container \"ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba\": container with ID starting with ea745fa85b8e3b13ed9fb943d71c5ad63bbc9aef48550100f2b90a443b5b3cba not found: ID does not exist"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.971002 4848 scope.go:117] "RemoveContainer" containerID="a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b"
Feb 17 09:22:26 crc kubenswrapper[4848]: E0217 09:22:26.974019 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b\": container with ID starting with a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b not found: ID does not exist" containerID="a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b"
Feb 17 09:22:26 crc kubenswrapper[4848]: I0217 09:22:26.974070 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b"} err="failed to get container status \"a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b\": rpc error: code = NotFound desc = could not find container \"a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b\": container with ID starting with a504accb14792dc54990de89dfec862cd8b5606f69498576ad67ed994807df6b not found: ID does not exist"
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.001410 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-2s8ss"]
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.008302 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6f455b5fc7-2s8ss"]
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.034696 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-749cc47784-q9crv"]
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.066848 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65667fcd94-ngsp8"
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.401349 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72029462-f77a-48a6-8fcc-49d2e9ab7046" path="/var/lib/kubelet/pods/72029462-f77a-48a6-8fcc-49d2e9ab7046/volumes"
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.401916 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" path="/var/lib/kubelet/pods/e37f2d41-bace-4ca3-a811-36b44ee278d4/volumes"
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.587698 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-65667fcd94-ngsp8"
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.912730 4848 generic.go:334] "Generic (PLEG): container finished" podID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" containerID="14214e79f728cd5082aa7a41736e11aa5222d422b77d8384aa975c0ab485daed" exitCode=0
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.912778 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d9a2cc0-eea8-401d-85db-7824e2ba0463","Type":"ContainerDied","Data":"14214e79f728cd5082aa7a41736e11aa5222d422b77d8384aa975c0ab485daed"}
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.915276 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65acfebd-febe-42db-8441-065b06724681","Type":"ContainerStarted","Data":"4ef48c22d7d4092e05fc05a711d1666f0a05c60d1689ef9d76ffb2bd836ab3de"}
Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.915648 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-749cc47784-q9crv" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon-log" containerID="cri-o://479b497cdf95e27b9cf69855fe7ee2bce4facd9452020fe51dd99e0da1ac4e2a" gracePeriod=30 Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.915701 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-749cc47784-q9crv" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon" containerID="cri-o://0cafaca3851618f0ffdb14a0cf40a0bb789ede4dd0cfff526ba29a414b7a3d5c" gracePeriod=30 Feb 17 09:22:27 crc kubenswrapper[4848]: I0217 09:22:27.945323 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.00792527 podStartE2EDuration="6.945307596s" podCreationTimestamp="2026-02-17 09:22:21 +0000 UTC" firstStartedPulling="2026-02-17 09:22:23.252906531 +0000 UTC m=+1020.796162177" lastFinishedPulling="2026-02-17 09:22:27.190288857 +0000 UTC m=+1024.733544503" observedRunningTime="2026-02-17 09:22:27.939991452 +0000 UTC m=+1025.483247098" watchObservedRunningTime="2026-02-17 09:22:27.945307596 +0000 UTC m=+1025.488563242" Feb 17 09:22:28 crc kubenswrapper[4848]: I0217 09:22:28.923704 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 09:22:29 crc kubenswrapper[4848]: I0217 09:22:29.936584 4848 generic.go:334] "Generic (PLEG): container finished" podID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" containerID="4513965a35748a27bcf0df3456b3d3d26b39d5aa26477a8740dd1e3c7c57b911" exitCode=0 Feb 17 09:22:29 crc kubenswrapper[4848]: I0217 09:22:29.936622 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"6d9a2cc0-eea8-401d-85db-7824e2ba0463","Type":"ContainerDied","Data":"4513965a35748a27bcf0df3456b3d3d26b39d5aa26477a8740dd1e3c7c57b911"} Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.392881 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.481392 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-combined-ca-bundle\") pod \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.481965 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqshj\" (UniqueName: \"kubernetes.io/projected/6d9a2cc0-eea8-401d-85db-7824e2ba0463-kube-api-access-sqshj\") pod \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.482069 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data-custom\") pod \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.482207 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-scripts\") pod \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.482287 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data\") pod \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.482362 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d9a2cc0-eea8-401d-85db-7824e2ba0463-etc-machine-id\") pod \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\" (UID: \"6d9a2cc0-eea8-401d-85db-7824e2ba0463\") " Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.483753 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6d9a2cc0-eea8-401d-85db-7824e2ba0463-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6d9a2cc0-eea8-401d-85db-7824e2ba0463" (UID: "6d9a2cc0-eea8-401d-85db-7824e2ba0463"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.489332 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-scripts" (OuterVolumeSpecName: "scripts") pod "6d9a2cc0-eea8-401d-85db-7824e2ba0463" (UID: "6d9a2cc0-eea8-401d-85db-7824e2ba0463"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.503984 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d9a2cc0-eea8-401d-85db-7824e2ba0463-kube-api-access-sqshj" (OuterVolumeSpecName: "kube-api-access-sqshj") pod "6d9a2cc0-eea8-401d-85db-7824e2ba0463" (UID: "6d9a2cc0-eea8-401d-85db-7824e2ba0463"). InnerVolumeSpecName "kube-api-access-sqshj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.505991 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "6d9a2cc0-eea8-401d-85db-7824e2ba0463" (UID: "6d9a2cc0-eea8-401d-85db-7824e2ba0463"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.548547 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6d9a2cc0-eea8-401d-85db-7824e2ba0463" (UID: "6d9a2cc0-eea8-401d-85db-7824e2ba0463"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.585849 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.585882 4848 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6d9a2cc0-eea8-401d-85db-7824e2ba0463-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.585892 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.585902 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqshj\" (UniqueName: \"kubernetes.io/projected/6d9a2cc0-eea8-401d-85db-7824e2ba0463-kube-api-access-sqshj\") on 
node \"crc\" DevicePath \"\"" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.585910 4848 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.598864 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data" (OuterVolumeSpecName: "config-data") pod "6d9a2cc0-eea8-401d-85db-7824e2ba0463" (UID: "6d9a2cc0-eea8-401d-85db-7824e2ba0463"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.687113 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6d9a2cc0-eea8-401d-85db-7824e2ba0463-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.946068 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"6d9a2cc0-eea8-401d-85db-7824e2ba0463","Type":"ContainerDied","Data":"55a876e45cb588c818891f03bde27abd76bd4925fba7772b6fba7107bde76f17"} Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.946110 4848 scope.go:117] "RemoveContainer" containerID="14214e79f728cd5082aa7a41736e11aa5222d422b77d8384aa975c0ab485daed" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.946227 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.971818 4848 scope.go:117] "RemoveContainer" containerID="4513965a35748a27bcf0df3456b3d3d26b39d5aa26477a8740dd1e3c7c57b911" Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.979425 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 09:22:30 crc kubenswrapper[4848]: I0217 09:22:30.986940 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.001507 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.001854 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8226a25-977f-4934-b40b-7504ab8f23e4" containerName="horizon" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.001871 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8226a25-977f-4934-b40b-7504ab8f23e4" containerName="horizon" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.001882 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72029462-f77a-48a6-8fcc-49d2e9ab7046" containerName="init" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.001888 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="72029462-f77a-48a6-8fcc-49d2e9ab7046" containerName="init" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.001905 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d790c2ab-67aa-4c46-9407-9fa991223dd0" containerName="horizon" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.001911 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d790c2ab-67aa-4c46-9407-9fa991223dd0" containerName="horizon" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.001921 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="adcdc585-f670-4bc3-be54-acf796d438df" containerName="horizon" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.001927 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="adcdc585-f670-4bc3-be54-acf796d438df" containerName="horizon" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.001940 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adcdc585-f670-4bc3-be54-acf796d438df" containerName="horizon-log" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.001945 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="adcdc585-f670-4bc3-be54-acf796d438df" containerName="horizon-log" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.001956 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" containerName="probe" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.001962 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" containerName="probe" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.001974 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerName="neutron-api" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.001979 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerName="neutron-api" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.001990 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" containerName="cinder-scheduler" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.001996 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" containerName="cinder-scheduler" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.002007 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72029462-f77a-48a6-8fcc-49d2e9ab7046" 
containerName="dnsmasq-dns" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002013 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="72029462-f77a-48a6-8fcc-49d2e9ab7046" containerName="dnsmasq-dns" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.002022 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d790c2ab-67aa-4c46-9407-9fa991223dd0" containerName="horizon-log" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002030 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d790c2ab-67aa-4c46-9407-9fa991223dd0" containerName="horizon-log" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.002041 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerName="neutron-httpd" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002047 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerName="neutron-httpd" Feb 17 09:22:31 crc kubenswrapper[4848]: E0217 09:22:31.002057 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8226a25-977f-4934-b40b-7504ab8f23e4" containerName="horizon-log" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002062 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8226a25-977f-4934-b40b-7504ab8f23e4" containerName="horizon-log" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002220 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8226a25-977f-4934-b40b-7504ab8f23e4" containerName="horizon-log" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002230 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="adcdc585-f670-4bc3-be54-acf796d438df" containerName="horizon-log" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002241 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="d790c2ab-67aa-4c46-9407-9fa991223dd0" containerName="horizon" Feb 17 09:22:31 crc 
kubenswrapper[4848]: I0217 09:22:31.002250 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerName="neutron-httpd" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002259 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="72029462-f77a-48a6-8fcc-49d2e9ab7046" containerName="dnsmasq-dns" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002268 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" containerName="probe" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002280 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" containerName="cinder-scheduler" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002288 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="d790c2ab-67aa-4c46-9407-9fa991223dd0" containerName="horizon-log" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002300 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8226a25-977f-4934-b40b-7504ab8f23e4" containerName="horizon" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002308 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e37f2d41-bace-4ca3-a811-36b44ee278d4" containerName="neutron-api" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.002319 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="adcdc585-f670-4bc3-be54-acf796d438df" containerName="horizon" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.009563 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.012702 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.014196 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.094045 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-scripts\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.094131 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkdj8\" (UniqueName: \"kubernetes.io/projected/96a17dca-14f1-42ed-aca6-45fc15067cd3-kube-api-access-nkdj8\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.094190 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-config-data\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.094242 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96a17dca-14f1-42ed-aca6-45fc15067cd3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.094299 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.094394 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.195862 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.196253 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.196277 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-scripts\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.196314 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkdj8\" 
(UniqueName: \"kubernetes.io/projected/96a17dca-14f1-42ed-aca6-45fc15067cd3-kube-api-access-nkdj8\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.196353 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-config-data\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.196389 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96a17dca-14f1-42ed-aca6-45fc15067cd3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.196513 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/96a17dca-14f1-42ed-aca6-45fc15067cd3-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.199635 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.200427 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-scripts\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " 
pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.200445 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-config-data\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.201361 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96a17dca-14f1-42ed-aca6-45fc15067cd3-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.216276 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkdj8\" (UniqueName: \"kubernetes.io/projected/96a17dca-14f1-42ed-aca6-45fc15067cd3-kube-api-access-nkdj8\") pod \"cinder-scheduler-0\" (UID: \"96a17dca-14f1-42ed-aca6-45fc15067cd3\") " pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.340316 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.404217 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d9a2cc0-eea8-401d-85db-7824e2ba0463" path="/var/lib/kubelet/pods/6d9a2cc0-eea8-401d-85db-7824e2ba0463/volumes" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.795268 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-749cc47784-q9crv" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.808803 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.954262 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96a17dca-14f1-42ed-aca6-45fc15067cd3","Type":"ContainerStarted","Data":"1840278f0ae7b4a5da51e989ce9a48e934c24043d24da8b7a59295bd5a870435"} Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.959373 4848 generic.go:334] "Generic (PLEG): container finished" podID="1068aa99-55d4-4778-ac02-b354de25d16e" containerID="0cafaca3851618f0ffdb14a0cf40a0bb789ede4dd0cfff526ba29a414b7a3d5c" exitCode=0 Feb 17 09:22:31 crc kubenswrapper[4848]: I0217 09:22:31.959418 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-749cc47784-q9crv" event={"ID":"1068aa99-55d4-4778-ac02-b354de25d16e","Type":"ContainerDied","Data":"0cafaca3851618f0ffdb14a0cf40a0bb789ede4dd0cfff526ba29a414b7a3d5c"} Feb 17 09:22:32 crc kubenswrapper[4848]: I0217 09:22:32.949007 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:32 crc kubenswrapper[4848]: I0217 09:22:32.975711 4848 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96a17dca-14f1-42ed-aca6-45fc15067cd3","Type":"ContainerStarted","Data":"0ed0d56372a6e8f679c70166da3b12df4604178bda0c4f4ca3a4fed73ca335c7"} Feb 17 09:22:33 crc kubenswrapper[4848]: I0217 09:22:33.102321 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56c75f4b6d-vbll8" Feb 17 09:22:33 crc kubenswrapper[4848]: I0217 09:22:33.171129 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-65667fcd94-ngsp8"] Feb 17 09:22:33 crc kubenswrapper[4848]: I0217 09:22:33.171369 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-65667fcd94-ngsp8" podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" containerName="barbican-api-log" containerID="cri-o://2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e" gracePeriod=30 Feb 17 09:22:33 crc kubenswrapper[4848]: I0217 09:22:33.171566 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-65667fcd94-ngsp8" podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" containerName="barbican-api" containerID="cri-o://c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3" gracePeriod=30 Feb 17 09:22:33 crc kubenswrapper[4848]: I0217 09:22:33.544128 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 17 09:22:33 crc kubenswrapper[4848]: I0217 09:22:33.998319 4848 generic.go:334] "Generic (PLEG): container finished" podID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" containerID="2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e" exitCode=143 Feb 17 09:22:33 crc kubenswrapper[4848]: I0217 09:22:33.998373 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65667fcd94-ngsp8" 
event={"ID":"5fbd606b-59f6-4ba4-8e62-d90235b987d4","Type":"ContainerDied","Data":"2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e"} Feb 17 09:22:34 crc kubenswrapper[4848]: I0217 09:22:34.022514 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"96a17dca-14f1-42ed-aca6-45fc15067cd3","Type":"ContainerStarted","Data":"7422e33f0b5e72aa59c79f29b1f3dc156dee1a20dd67b5346cd2737a18489e93"} Feb 17 09:22:34 crc kubenswrapper[4848]: I0217 09:22:34.048982 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.048969298 podStartE2EDuration="4.048969298s" podCreationTimestamp="2026-02-17 09:22:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:34.046220889 +0000 UTC m=+1031.589476535" watchObservedRunningTime="2026-02-17 09:22:34.048969298 +0000 UTC m=+1031.592224944" Feb 17 09:22:35 crc kubenswrapper[4848]: I0217 09:22:35.153223 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-76c7ffd8bf-x42cc" Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.341313 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.358081 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-65667fcd94-ngsp8" podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:58670->10.217.0.163:9311: read: connection reset by peer" Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.358093 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-65667fcd94-ngsp8" podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" 
containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:58684->10.217.0.163:9311: read: connection reset by peer" Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.788748 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.902025 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data\") pod \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.902155 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-combined-ca-bundle\") pod \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.902187 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fbd606b-59f6-4ba4-8e62-d90235b987d4-logs\") pod \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.902317 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data-custom\") pod \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.902356 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncs58\" (UniqueName: 
\"kubernetes.io/projected/5fbd606b-59f6-4ba4-8e62-d90235b987d4-kube-api-access-ncs58\") pod \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\" (UID: \"5fbd606b-59f6-4ba4-8e62-d90235b987d4\") " Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.903325 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fbd606b-59f6-4ba4-8e62-d90235b987d4-logs" (OuterVolumeSpecName: "logs") pod "5fbd606b-59f6-4ba4-8e62-d90235b987d4" (UID: "5fbd606b-59f6-4ba4-8e62-d90235b987d4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.907520 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fbd606b-59f6-4ba4-8e62-d90235b987d4-kube-api-access-ncs58" (OuterVolumeSpecName: "kube-api-access-ncs58") pod "5fbd606b-59f6-4ba4-8e62-d90235b987d4" (UID: "5fbd606b-59f6-4ba4-8e62-d90235b987d4"). InnerVolumeSpecName "kube-api-access-ncs58". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.910269 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "5fbd606b-59f6-4ba4-8e62-d90235b987d4" (UID: "5fbd606b-59f6-4ba4-8e62-d90235b987d4"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.940488 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fbd606b-59f6-4ba4-8e62-d90235b987d4" (UID: "5fbd606b-59f6-4ba4-8e62-d90235b987d4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:36 crc kubenswrapper[4848]: I0217 09:22:36.960442 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data" (OuterVolumeSpecName: "config-data") pod "5fbd606b-59f6-4ba4-8e62-d90235b987d4" (UID: "5fbd606b-59f6-4ba4-8e62-d90235b987d4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.004416 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.004445 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.004457 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fbd606b-59f6-4ba4-8e62-d90235b987d4-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.004467 4848 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/5fbd606b-59f6-4ba4-8e62-d90235b987d4-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.004476 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncs58\" (UniqueName: \"kubernetes.io/projected/5fbd606b-59f6-4ba4-8e62-d90235b987d4-kube-api-access-ncs58\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.051084 4848 generic.go:334] "Generic (PLEG): container finished" podID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" 
containerID="c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3" exitCode=0 Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.051129 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65667fcd94-ngsp8" event={"ID":"5fbd606b-59f6-4ba4-8e62-d90235b987d4","Type":"ContainerDied","Data":"c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3"} Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.051159 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-65667fcd94-ngsp8" event={"ID":"5fbd606b-59f6-4ba4-8e62-d90235b987d4","Type":"ContainerDied","Data":"379431ab4949f910acbf9a00dc4eb17b6386de3783ab8f2d248a89617877e8bc"} Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.051176 4848 scope.go:117] "RemoveContainer" containerID="c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.051300 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-65667fcd94-ngsp8" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.082110 4848 scope.go:117] "RemoveContainer" containerID="2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.095357 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-65667fcd94-ngsp8"] Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.100200 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-65667fcd94-ngsp8"] Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.105806 4848 scope.go:117] "RemoveContainer" containerID="c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3" Feb 17 09:22:37 crc kubenswrapper[4848]: E0217 09:22:37.106408 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3\": container with ID starting with c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3 not found: ID does not exist" containerID="c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.106458 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3"} err="failed to get container status \"c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3\": rpc error: code = NotFound desc = could not find container \"c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3\": container with ID starting with c3ecd8825a36de61f635e02b70fe33c7f6d63d35a1006ae84297ca4996c669b3 not found: ID does not exist" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.106508 4848 scope.go:117] "RemoveContainer" containerID="2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e" Feb 
17 09:22:37 crc kubenswrapper[4848]: E0217 09:22:37.106895 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e\": container with ID starting with 2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e not found: ID does not exist" containerID="2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.106941 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e"} err="failed to get container status \"2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e\": rpc error: code = NotFound desc = could not find container \"2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e\": container with ID starting with 2abb4270db7b9df48b7fe1880e79f71943d782d3801ff1a452c0f6b93041ab2e not found: ID does not exist" Feb 17 09:22:37 crc kubenswrapper[4848]: I0217 09:22:37.395434 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" path="/var/lib/kubelet/pods/5fbd606b-59f6-4ba4-8e62-d90235b987d4/volumes" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.829691 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 17 09:22:38 crc kubenswrapper[4848]: E0217 09:22:38.830142 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" containerName="barbican-api" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.830157 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" containerName="barbican-api" Feb 17 09:22:38 crc kubenswrapper[4848]: E0217 09:22:38.830196 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" containerName="barbican-api-log" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.830202 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" containerName="barbican-api-log" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.830362 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" containerName="barbican-api" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.830385 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fbd606b-59f6-4ba4-8e62-d90235b987d4" containerName="barbican-api-log" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.830957 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.836569 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.837072 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.846094 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-bntnb" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.851326 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.935836 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5211bb87-9d50-485f-aa61-43f8d57339c7-openstack-config\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.935892 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5211bb87-9d50-485f-aa61-43f8d57339c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.935938 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5211bb87-9d50-485f-aa61-43f8d57339c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:38 crc kubenswrapper[4848]: I0217 09:22:38.935969 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g768n\" (UniqueName: \"kubernetes.io/projected/5211bb87-9d50-485f-aa61-43f8d57339c7-kube-api-access-g768n\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:39 crc kubenswrapper[4848]: I0217 09:22:39.037742 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g768n\" (UniqueName: \"kubernetes.io/projected/5211bb87-9d50-485f-aa61-43f8d57339c7-kube-api-access-g768n\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:39 crc kubenswrapper[4848]: I0217 09:22:39.037890 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5211bb87-9d50-485f-aa61-43f8d57339c7-openstack-config\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:39 crc kubenswrapper[4848]: I0217 09:22:39.037932 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5211bb87-9d50-485f-aa61-43f8d57339c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:39 crc kubenswrapper[4848]: I0217 09:22:39.037994 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5211bb87-9d50-485f-aa61-43f8d57339c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:39 crc kubenswrapper[4848]: I0217 09:22:39.038834 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5211bb87-9d50-485f-aa61-43f8d57339c7-openstack-config\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:39 crc kubenswrapper[4848]: I0217 09:22:39.044133 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5211bb87-9d50-485f-aa61-43f8d57339c7-openstack-config-secret\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:39 crc kubenswrapper[4848]: I0217 09:22:39.045004 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5211bb87-9d50-485f-aa61-43f8d57339c7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:39 crc kubenswrapper[4848]: I0217 09:22:39.060622 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g768n\" (UniqueName: \"kubernetes.io/projected/5211bb87-9d50-485f-aa61-43f8d57339c7-kube-api-access-g768n\") pod \"openstackclient\" (UID: 
\"5211bb87-9d50-485f-aa61-43f8d57339c7\") " pod="openstack/openstackclient" Feb 17 09:22:39 crc kubenswrapper[4848]: I0217 09:22:39.148906 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 17 09:22:39 crc kubenswrapper[4848]: I0217 09:22:39.682076 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 17 09:22:40 crc kubenswrapper[4848]: I0217 09:22:40.081528 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5211bb87-9d50-485f-aa61-43f8d57339c7","Type":"ContainerStarted","Data":"c96f4c9380fc921b8790d57f87189b2b32008e47010cb3376922a60863ff3fc4"} Feb 17 09:22:41 crc kubenswrapper[4848]: I0217 09:22:41.666395 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 17 09:22:41 crc kubenswrapper[4848]: I0217 09:22:41.795446 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-749cc47784-q9crv" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 17 09:22:41 crc kubenswrapper[4848]: I0217 09:22:41.848836 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:41 crc kubenswrapper[4848]: I0217 09:22:41.849380 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="ceilometer-central-agent" containerID="cri-o://e7152b88ec933da1a2c6d7bc7a639190e9aeb9bc89fdcce26d10c9b357f6a5e8" gracePeriod=30 Feb 17 09:22:41 crc kubenswrapper[4848]: I0217 09:22:41.849497 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65acfebd-febe-42db-8441-065b06724681" 
containerName="sg-core" containerID="cri-o://fe2fb4796981bf23222fd2c45c8056b0de9d974652b09f75edd5fe50e3b32922" gracePeriod=30 Feb 17 09:22:41 crc kubenswrapper[4848]: I0217 09:22:41.849680 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="proxy-httpd" containerID="cri-o://4ef48c22d7d4092e05fc05a711d1666f0a05c60d1689ef9d76ffb2bd836ab3de" gracePeriod=30 Feb 17 09:22:41 crc kubenswrapper[4848]: I0217 09:22:41.849769 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="ceilometer-notification-agent" containerID="cri-o://33cb62653bf607b598e80d536c84915fc8ebed23b57517a1fe4ae00d0262d473" gracePeriod=30 Feb 17 09:22:41 crc kubenswrapper[4848]: I0217 09:22:41.867726 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.169:3000/\": EOF" Feb 17 09:22:42 crc kubenswrapper[4848]: I0217 09:22:42.102556 4848 generic.go:334] "Generic (PLEG): container finished" podID="65acfebd-febe-42db-8441-065b06724681" containerID="4ef48c22d7d4092e05fc05a711d1666f0a05c60d1689ef9d76ffb2bd836ab3de" exitCode=0 Feb 17 09:22:42 crc kubenswrapper[4848]: I0217 09:22:42.102588 4848 generic.go:334] "Generic (PLEG): container finished" podID="65acfebd-febe-42db-8441-065b06724681" containerID="fe2fb4796981bf23222fd2c45c8056b0de9d974652b09f75edd5fe50e3b32922" exitCode=2 Feb 17 09:22:42 crc kubenswrapper[4848]: I0217 09:22:42.102605 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65acfebd-febe-42db-8441-065b06724681","Type":"ContainerDied","Data":"4ef48c22d7d4092e05fc05a711d1666f0a05c60d1689ef9d76ffb2bd836ab3de"} Feb 17 09:22:42 crc kubenswrapper[4848]: I0217 09:22:42.102630 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65acfebd-febe-42db-8441-065b06724681","Type":"ContainerDied","Data":"fe2fb4796981bf23222fd2c45c8056b0de9d974652b09f75edd5fe50e3b32922"} Feb 17 09:22:43 crc kubenswrapper[4848]: I0217 09:22:43.114546 4848 generic.go:334] "Generic (PLEG): container finished" podID="65acfebd-febe-42db-8441-065b06724681" containerID="e7152b88ec933da1a2c6d7bc7a639190e9aeb9bc89fdcce26d10c9b357f6a5e8" exitCode=0 Feb 17 09:22:43 crc kubenswrapper[4848]: I0217 09:22:43.114732 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65acfebd-febe-42db-8441-065b06724681","Type":"ContainerDied","Data":"e7152b88ec933da1a2c6d7bc7a639190e9aeb9bc89fdcce26d10c9b357f6a5e8"} Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.356177 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-f467db6f-7x6cx"] Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.370878 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.376902 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.378140 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.378368 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.398914 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-f467db6f-7x6cx"] Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.469407 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-public-tls-certs\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.469501 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-config-data\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.469572 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/abcdb3d8-da38-472a-bdb3-e1615f832970-etc-swift\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.469592 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abcdb3d8-da38-472a-bdb3-e1615f832970-log-httpd\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.469627 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-combined-ca-bundle\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.469660 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-internal-tls-certs\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.469684 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rhsw\" (UniqueName: \"kubernetes.io/projected/abcdb3d8-da38-472a-bdb3-e1615f832970-kube-api-access-9rhsw\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.469700 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abcdb3d8-da38-472a-bdb3-e1615f832970-run-httpd\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: 
I0217 09:22:45.571163 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/abcdb3d8-da38-472a-bdb3-e1615f832970-etc-swift\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.571210 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abcdb3d8-da38-472a-bdb3-e1615f832970-log-httpd\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.571238 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-combined-ca-bundle\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.571269 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-internal-tls-certs\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.571291 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rhsw\" (UniqueName: \"kubernetes.io/projected/abcdb3d8-da38-472a-bdb3-e1615f832970-kube-api-access-9rhsw\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.571310 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abcdb3d8-da38-472a-bdb3-e1615f832970-run-httpd\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.571382 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-public-tls-certs\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.571405 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-config-data\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.572044 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abcdb3d8-da38-472a-bdb3-e1615f832970-log-httpd\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.572111 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/abcdb3d8-da38-472a-bdb3-e1615f832970-run-httpd\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.576723 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/abcdb3d8-da38-472a-bdb3-e1615f832970-etc-swift\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.579482 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-internal-tls-certs\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.582052 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-combined-ca-bundle\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.589711 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-config-data\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.589917 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/abcdb3d8-da38-472a-bdb3-e1615f832970-public-tls-certs\") pod \"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.591521 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rhsw\" (UniqueName: \"kubernetes.io/projected/abcdb3d8-da38-472a-bdb3-e1615f832970-kube-api-access-9rhsw\") pod 
\"swift-proxy-f467db6f-7x6cx\" (UID: \"abcdb3d8-da38-472a-bdb3-e1615f832970\") " pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:45 crc kubenswrapper[4848]: I0217 09:22:45.694109 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:46 crc kubenswrapper[4848]: I0217 09:22:46.207580 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:46 crc kubenswrapper[4848]: I0217 09:22:46.208640 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7b6cdb54-tkxbl" Feb 17 09:22:47 crc kubenswrapper[4848]: I0217 09:22:47.153158 4848 generic.go:334] "Generic (PLEG): container finished" podID="65acfebd-febe-42db-8441-065b06724681" containerID="33cb62653bf607b598e80d536c84915fc8ebed23b57517a1fe4ae00d0262d473" exitCode=0 Feb 17 09:22:47 crc kubenswrapper[4848]: I0217 09:22:47.153243 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65acfebd-febe-42db-8441-065b06724681","Type":"ContainerDied","Data":"33cb62653bf607b598e80d536c84915fc8ebed23b57517a1fe4ae00d0262d473"} Feb 17 09:22:48 crc kubenswrapper[4848]: I0217 09:22:48.771982 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:22:48 crc kubenswrapper[4848]: I0217 09:22:48.772292 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:22:48 crc 
kubenswrapper[4848]: I0217 09:22:48.772343 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:22:48 crc kubenswrapper[4848]: I0217 09:22:48.773143 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"394ea7e530ba2c06a8e2c57a8f43255e3afc07d0c7f59b99a48b84ecd7fdc2a0"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:22:48 crc kubenswrapper[4848]: I0217 09:22:48.773193 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://394ea7e530ba2c06a8e2c57a8f43255e3afc07d0c7f59b99a48b84ecd7fdc2a0" gracePeriod=600 Feb 17 09:22:49 crc kubenswrapper[4848]: I0217 09:22:49.179346 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="394ea7e530ba2c06a8e2c57a8f43255e3afc07d0c7f59b99a48b84ecd7fdc2a0" exitCode=0 Feb 17 09:22:49 crc kubenswrapper[4848]: I0217 09:22:49.179420 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"394ea7e530ba2c06a8e2c57a8f43255e3afc07d0c7f59b99a48b84ecd7fdc2a0"} Feb 17 09:22:49 crc kubenswrapper[4848]: I0217 09:22:49.179685 4848 scope.go:117] "RemoveContainer" containerID="cf7c734d597165a992ca275dfa403ad67456d929b1c93b35482f6a777604c954" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.363170 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.563652 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-log-httpd\") pod \"65acfebd-febe-42db-8441-065b06724681\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.564011 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-scripts\") pod \"65acfebd-febe-42db-8441-065b06724681\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.564067 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-combined-ca-bundle\") pod \"65acfebd-febe-42db-8441-065b06724681\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.564106 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdjpw\" (UniqueName: \"kubernetes.io/projected/65acfebd-febe-42db-8441-065b06724681-kube-api-access-qdjpw\") pod \"65acfebd-febe-42db-8441-065b06724681\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.564740 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "65acfebd-febe-42db-8441-065b06724681" (UID: "65acfebd-febe-42db-8441-065b06724681"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.564799 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-config-data\") pod \"65acfebd-febe-42db-8441-065b06724681\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.564903 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-sg-core-conf-yaml\") pod \"65acfebd-febe-42db-8441-065b06724681\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.564938 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-run-httpd\") pod \"65acfebd-febe-42db-8441-065b06724681\" (UID: \"65acfebd-febe-42db-8441-065b06724681\") " Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.565458 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "65acfebd-febe-42db-8441-065b06724681" (UID: "65acfebd-febe-42db-8441-065b06724681"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.566908 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.567725 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65acfebd-febe-42db-8441-065b06724681-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.570915 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-scripts" (OuterVolumeSpecName: "scripts") pod "65acfebd-febe-42db-8441-065b06724681" (UID: "65acfebd-febe-42db-8441-065b06724681"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.572154 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65acfebd-febe-42db-8441-065b06724681-kube-api-access-qdjpw" (OuterVolumeSpecName: "kube-api-access-qdjpw") pod "65acfebd-febe-42db-8441-065b06724681" (UID: "65acfebd-febe-42db-8441-065b06724681"). InnerVolumeSpecName "kube-api-access-qdjpw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.604921 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "65acfebd-febe-42db-8441-065b06724681" (UID: "65acfebd-febe-42db-8441-065b06724681"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.637869 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65acfebd-febe-42db-8441-065b06724681" (UID: "65acfebd-febe-42db-8441-065b06724681"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.669920 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.670203 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdjpw\" (UniqueName: \"kubernetes.io/projected/65acfebd-febe-42db-8441-065b06724681-kube-api-access-qdjpw\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.670287 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.670351 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.672924 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-config-data" (OuterVolumeSpecName: "config-data") pod "65acfebd-febe-42db-8441-065b06724681" (UID: "65acfebd-febe-42db-8441-065b06724681"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.697065 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-f467db6f-7x6cx"] Feb 17 09:22:50 crc kubenswrapper[4848]: W0217 09:22:50.697874 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podabcdb3d8_da38_472a_bdb3_e1615f832970.slice/crio-5eefc97574ca4bfc849df0de6c4029d5f9434b743209bc05ad43d2887afc7b3b WatchSource:0}: Error finding container 5eefc97574ca4bfc849df0de6c4029d5f9434b743209bc05ad43d2887afc7b3b: Status 404 returned error can't find the container with id 5eefc97574ca4bfc849df0de6c4029d5f9434b743209bc05ad43d2887afc7b3b Feb 17 09:22:50 crc kubenswrapper[4848]: I0217 09:22:50.771677 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65acfebd-febe-42db-8441-065b06724681-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.216344 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"42389abaf9cd91ca5ab94c566a487ee2516f882d6f353baa7369223b1e0966e6"} Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.220836 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f467db6f-7x6cx" event={"ID":"abcdb3d8-da38-472a-bdb3-e1615f832970","Type":"ContainerStarted","Data":"37c9fcd32ceaee47261f3e698d5fff0645ef32679d94f5ebfce4058852a46ea9"} Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.220879 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f467db6f-7x6cx" event={"ID":"abcdb3d8-da38-472a-bdb3-e1615f832970","Type":"ContainerStarted","Data":"22a1f9075681e6a9f3e82984650fb2e81f51472885da47aa0605c14b87aaca52"} Feb 17 09:22:51 crc 
kubenswrapper[4848]: I0217 09:22:51.220891 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-f467db6f-7x6cx" event={"ID":"abcdb3d8-da38-472a-bdb3-e1615f832970","Type":"ContainerStarted","Data":"5eefc97574ca4bfc849df0de6c4029d5f9434b743209bc05ad43d2887afc7b3b"} Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.220956 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.223448 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.223941 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65acfebd-febe-42db-8441-065b06724681","Type":"ContainerDied","Data":"1d4d95c874b8c554543aa4564d519795eb8b53eac2ff74e3257f1ee86205096b"} Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.223973 4848 scope.go:117] "RemoveContainer" containerID="4ef48c22d7d4092e05fc05a711d1666f0a05c60d1689ef9d76ffb2bd836ab3de" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.229338 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5211bb87-9d50-485f-aa61-43f8d57339c7","Type":"ContainerStarted","Data":"d386075a6964fc61355d8d42a41af90f10abe9a2af43fccf6c2247ab38b6e718"} Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.249705 4848 scope.go:117] "RemoveContainer" containerID="fe2fb4796981bf23222fd2c45c8056b0de9d974652b09f75edd5fe50e3b32922" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.279161 4848 scope.go:117] "RemoveContainer" containerID="33cb62653bf607b598e80d536c84915fc8ebed23b57517a1fe4ae00d0262d473" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.295833 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.308921 4848 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.318779 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.8827338620000003 podStartE2EDuration="13.318746698s" podCreationTimestamp="2026-02-17 09:22:38 +0000 UTC" firstStartedPulling="2026-02-17 09:22:39.691351685 +0000 UTC m=+1037.234607331" lastFinishedPulling="2026-02-17 09:22:50.127364521 +0000 UTC m=+1047.670620167" observedRunningTime="2026-02-17 09:22:51.314059532 +0000 UTC m=+1048.857315178" watchObservedRunningTime="2026-02-17 09:22:51.318746698 +0000 UTC m=+1048.862002344" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.330918 4848 scope.go:117] "RemoveContainer" containerID="e7152b88ec933da1a2c6d7bc7a639190e9aeb9bc89fdcce26d10c9b357f6a5e8" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.353451 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:51 crc kubenswrapper[4848]: E0217 09:22:51.353821 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="ceilometer-notification-agent" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.353836 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="ceilometer-notification-agent" Feb 17 09:22:51 crc kubenswrapper[4848]: E0217 09:22:51.353860 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="proxy-httpd" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.353867 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="proxy-httpd" Feb 17 09:22:51 crc kubenswrapper[4848]: E0217 09:22:51.353874 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65acfebd-febe-42db-8441-065b06724681" containerName="sg-core" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.353882 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="sg-core" Feb 17 09:22:51 crc kubenswrapper[4848]: E0217 09:22:51.353903 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="ceilometer-central-agent" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.353909 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="ceilometer-central-agent" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.354071 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="ceilometer-central-agent" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.354084 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="ceilometer-notification-agent" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.354096 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="sg-core" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.354112 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="65acfebd-febe-42db-8441-065b06724681" containerName="proxy-httpd" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.355624 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.361073 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.365269 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.378816 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-f467db6f-7x6cx" podStartSLOduration=6.37879554 podStartE2EDuration="6.37879554s" podCreationTimestamp="2026-02-17 09:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:51.360093035 +0000 UTC m=+1048.903348691" watchObservedRunningTime="2026-02-17 09:22:51.37879554 +0000 UTC m=+1048.922051186" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.406246 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65acfebd-febe-42db-8441-065b06724681" path="/var/lib/kubelet/pods/65acfebd-febe-42db-8441-065b06724681/volumes" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.407137 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.482247 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.482306 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5pk\" (UniqueName: 
\"kubernetes.io/projected/906e1770-f9be-4539-8bb9-443d30a30691-kube-api-access-4t5pk\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.482371 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.482428 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-config-data\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.482448 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-run-httpd\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.482473 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-scripts\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.482787 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-log-httpd\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " 
pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.584362 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-config-data\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.584402 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-run-httpd\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.584441 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-scripts\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.584553 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-log-httpd\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.584616 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.584641 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t5pk\" (UniqueName: 
\"kubernetes.io/projected/906e1770-f9be-4539-8bb9-443d30a30691-kube-api-access-4t5pk\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.584670 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.586251 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-run-httpd\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.586689 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-log-httpd\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.590193 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.590532 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-config-data\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.591280 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-scripts\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.591810 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.601280 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t5pk\" (UniqueName: \"kubernetes.io/projected/906e1770-f9be-4539-8bb9-443d30a30691-kube-api-access-4t5pk\") pod \"ceilometer-0\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.681183 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.795010 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-749cc47784-q9crv" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.150:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.150:8443: connect: connection refused" Feb 17 09:22:51 crc kubenswrapper[4848]: I0217 09:22:51.795120 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:22:52 crc kubenswrapper[4848]: I0217 09:22:52.185354 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:22:52 crc kubenswrapper[4848]: W0217 09:22:52.187781 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod906e1770_f9be_4539_8bb9_443d30a30691.slice/crio-2ab9f4454bc844aa29163f9b6ce9c795b53fae8a61fbe5478160c18ce415d141 WatchSource:0}: Error finding container 2ab9f4454bc844aa29163f9b6ce9c795b53fae8a61fbe5478160c18ce415d141: Status 404 returned error can't find the container with id 2ab9f4454bc844aa29163f9b6ce9c795b53fae8a61fbe5478160c18ce415d141 Feb 17 09:22:52 crc kubenswrapper[4848]: I0217 09:22:52.239725 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906e1770-f9be-4539-8bb9-443d30a30691","Type":"ContainerStarted","Data":"2ab9f4454bc844aa29163f9b6ce9c795b53fae8a61fbe5478160c18ce415d141"} Feb 17 09:22:52 crc kubenswrapper[4848]: I0217 09:22:52.241825 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-f467db6f-7x6cx" Feb 17 09:22:52 crc kubenswrapper[4848]: I0217 09:22:52.434909 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-685b8f6845-8tvq5" Feb 17 09:22:52 crc 
kubenswrapper[4848]: I0217 09:22:52.517572 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6cb4bc9fb8-r52gj"] Feb 17 09:22:52 crc kubenswrapper[4848]: I0217 09:22:52.517865 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6cb4bc9fb8-r52gj" podUID="60ec54d0-c985-48d2-b081-c1082d132e65" containerName="neutron-api" containerID="cri-o://3df3f5833c5cdb4a1c7b5b2236a4f4786eca712945f32218dd228bd723839960" gracePeriod=30 Feb 17 09:22:52 crc kubenswrapper[4848]: I0217 09:22:52.518260 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6cb4bc9fb8-r52gj" podUID="60ec54d0-c985-48d2-b081-c1082d132e65" containerName="neutron-httpd" containerID="cri-o://f6ea2a2dd8077ea22a1495dc5fd86e223ff5877b1de922b517b656dd0480dc14" gracePeriod=30 Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.251322 4848 generic.go:334] "Generic (PLEG): container finished" podID="60ec54d0-c985-48d2-b081-c1082d132e65" containerID="f6ea2a2dd8077ea22a1495dc5fd86e223ff5877b1de922b517b656dd0480dc14" exitCode=0 Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.252415 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cb4bc9fb8-r52gj" event={"ID":"60ec54d0-c985-48d2-b081-c1082d132e65","Type":"ContainerDied","Data":"f6ea2a2dd8077ea22a1495dc5fd86e223ff5877b1de922b517b656dd0480dc14"} Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.505130 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xqmdf"] Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.507744 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xqmdf" Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.519290 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xqmdf"] Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.533902 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwmtt\" (UniqueName: \"kubernetes.io/projected/2ef72abf-423d-4df8-8d07-f2340b53ddff-kube-api-access-mwmtt\") pod \"nova-api-db-create-xqmdf\" (UID: \"2ef72abf-423d-4df8-8d07-f2340b53ddff\") " pod="openstack/nova-api-db-create-xqmdf" Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.533967 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef72abf-423d-4df8-8d07-f2340b53ddff-operator-scripts\") pod \"nova-api-db-create-xqmdf\" (UID: \"2ef72abf-423d-4df8-8d07-f2340b53ddff\") " pod="openstack/nova-api-db-create-xqmdf" Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.618065 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-j9fzr"] Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.629822 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-j9fzr"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.636529 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwmtt\" (UniqueName: \"kubernetes.io/projected/2ef72abf-423d-4df8-8d07-f2340b53ddff-kube-api-access-mwmtt\") pod \"nova-api-db-create-xqmdf\" (UID: \"2ef72abf-423d-4df8-8d07-f2340b53ddff\") " pod="openstack/nova-api-db-create-xqmdf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.636614 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef72abf-423d-4df8-8d07-f2340b53ddff-operator-scripts\") pod \"nova-api-db-create-xqmdf\" (UID: \"2ef72abf-423d-4df8-8d07-f2340b53ddff\") " pod="openstack/nova-api-db-create-xqmdf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.637549 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef72abf-423d-4df8-8d07-f2340b53ddff-operator-scripts\") pod \"nova-api-db-create-xqmdf\" (UID: \"2ef72abf-423d-4df8-8d07-f2340b53ddff\") " pod="openstack/nova-api-db-create-xqmdf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.642319 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-j9fzr"]
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.665313 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwmtt\" (UniqueName: \"kubernetes.io/projected/2ef72abf-423d-4df8-8d07-f2340b53ddff-kube-api-access-mwmtt\") pod \"nova-api-db-create-xqmdf\" (UID: \"2ef72abf-423d-4df8-8d07-f2340b53ddff\") " pod="openstack/nova-api-db-create-xqmdf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.666906 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-5bbe-account-create-update-9j27z"]
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.668015 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5bbe-account-create-update-9j27z"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.671123 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.677346 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5bbe-account-create-update-9j27z"]
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.738278 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-operator-scripts\") pod \"nova-api-5bbe-account-create-update-9j27z\" (UID: \"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0\") " pod="openstack/nova-api-5bbe-account-create-update-9j27z"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.738346 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lp7w\" (UniqueName: \"kubernetes.io/projected/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-kube-api-access-6lp7w\") pod \"nova-api-5bbe-account-create-update-9j27z\" (UID: \"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0\") " pod="openstack/nova-api-5bbe-account-create-update-9j27z"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.738380 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dc6db25-7a23-453e-b3e5-331167e6d51e-operator-scripts\") pod \"nova-cell0-db-create-j9fzr\" (UID: \"7dc6db25-7a23-453e-b3e5-331167e6d51e\") " pod="openstack/nova-cell0-db-create-j9fzr"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.738411 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7lcx\" (UniqueName: \"kubernetes.io/projected/7dc6db25-7a23-453e-b3e5-331167e6d51e-kube-api-access-t7lcx\") pod \"nova-cell0-db-create-j9fzr\" (UID: \"7dc6db25-7a23-453e-b3e5-331167e6d51e\") " pod="openstack/nova-cell0-db-create-j9fzr"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.812974 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-9nsmf"]
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.814352 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9nsmf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.822024 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-ef1a-account-create-update-szj2h"]
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.823221 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xqmdf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.823369 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ef1a-account-create-update-szj2h"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.833222 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.838595 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9nsmf"]
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.840341 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-operator-scripts\") pod \"nova-api-5bbe-account-create-update-9j27z\" (UID: \"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0\") " pod="openstack/nova-api-5bbe-account-create-update-9j27z"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.840385 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8p6d\" (UniqueName: \"kubernetes.io/projected/8e88c051-4096-47e8-b941-4cee5e3971ef-kube-api-access-d8p6d\") pod \"nova-cell0-ef1a-account-create-update-szj2h\" (UID: \"8e88c051-4096-47e8-b941-4cee5e3971ef\") " pod="openstack/nova-cell0-ef1a-account-create-update-szj2h"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.840409 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lp7w\" (UniqueName: \"kubernetes.io/projected/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-kube-api-access-6lp7w\") pod \"nova-api-5bbe-account-create-update-9j27z\" (UID: \"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0\") " pod="openstack/nova-api-5bbe-account-create-update-9j27z"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.840432 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9q2q\" (UniqueName: \"kubernetes.io/projected/7e349818-833c-41bd-83dd-00ca201c100c-kube-api-access-c9q2q\") pod \"nova-cell1-db-create-9nsmf\" (UID: \"7e349818-833c-41bd-83dd-00ca201c100c\") " pod="openstack/nova-cell1-db-create-9nsmf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.840450 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dc6db25-7a23-453e-b3e5-331167e6d51e-operator-scripts\") pod \"nova-cell0-db-create-j9fzr\" (UID: \"7dc6db25-7a23-453e-b3e5-331167e6d51e\") " pod="openstack/nova-cell0-db-create-j9fzr"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.840473 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7lcx\" (UniqueName: \"kubernetes.io/projected/7dc6db25-7a23-453e-b3e5-331167e6d51e-kube-api-access-t7lcx\") pod \"nova-cell0-db-create-j9fzr\" (UID: \"7dc6db25-7a23-453e-b3e5-331167e6d51e\") " pod="openstack/nova-cell0-db-create-j9fzr"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.840500 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e88c051-4096-47e8-b941-4cee5e3971ef-operator-scripts\") pod \"nova-cell0-ef1a-account-create-update-szj2h\" (UID: \"8e88c051-4096-47e8-b941-4cee5e3971ef\") " pod="openstack/nova-cell0-ef1a-account-create-update-szj2h"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.840526 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e349818-833c-41bd-83dd-00ca201c100c-operator-scripts\") pod \"nova-cell1-db-create-9nsmf\" (UID: \"7e349818-833c-41bd-83dd-00ca201c100c\") " pod="openstack/nova-cell1-db-create-9nsmf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.841311 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-operator-scripts\") pod \"nova-api-5bbe-account-create-update-9j27z\" (UID: \"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0\") " pod="openstack/nova-api-5bbe-account-create-update-9j27z"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.842134 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dc6db25-7a23-453e-b3e5-331167e6d51e-operator-scripts\") pod \"nova-cell0-db-create-j9fzr\" (UID: \"7dc6db25-7a23-453e-b3e5-331167e6d51e\") " pod="openstack/nova-cell0-db-create-j9fzr"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.850442 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ef1a-account-create-update-szj2h"]
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.868789 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lp7w\" (UniqueName: \"kubernetes.io/projected/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-kube-api-access-6lp7w\") pod \"nova-api-5bbe-account-create-update-9j27z\" (UID: \"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0\") " pod="openstack/nova-api-5bbe-account-create-update-9j27z"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.870064 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7lcx\" (UniqueName: \"kubernetes.io/projected/7dc6db25-7a23-453e-b3e5-331167e6d51e-kube-api-access-t7lcx\") pod \"nova-cell0-db-create-j9fzr\" (UID: \"7dc6db25-7a23-453e-b3e5-331167e6d51e\") " pod="openstack/nova-cell0-db-create-j9fzr"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.943288 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8p6d\" (UniqueName: \"kubernetes.io/projected/8e88c051-4096-47e8-b941-4cee5e3971ef-kube-api-access-d8p6d\") pod \"nova-cell0-ef1a-account-create-update-szj2h\" (UID: \"8e88c051-4096-47e8-b941-4cee5e3971ef\") " pod="openstack/nova-cell0-ef1a-account-create-update-szj2h"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.943654 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9q2q\" (UniqueName: \"kubernetes.io/projected/7e349818-833c-41bd-83dd-00ca201c100c-kube-api-access-c9q2q\") pod \"nova-cell1-db-create-9nsmf\" (UID: \"7e349818-833c-41bd-83dd-00ca201c100c\") " pod="openstack/nova-cell1-db-create-9nsmf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.943733 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e88c051-4096-47e8-b941-4cee5e3971ef-operator-scripts\") pod \"nova-cell0-ef1a-account-create-update-szj2h\" (UID: \"8e88c051-4096-47e8-b941-4cee5e3971ef\") " pod="openstack/nova-cell0-ef1a-account-create-update-szj2h"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.943799 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e349818-833c-41bd-83dd-00ca201c100c-operator-scripts\") pod \"nova-cell1-db-create-9nsmf\" (UID: \"7e349818-833c-41bd-83dd-00ca201c100c\") " pod="openstack/nova-cell1-db-create-9nsmf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.944552 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e88c051-4096-47e8-b941-4cee5e3971ef-operator-scripts\") pod \"nova-cell0-ef1a-account-create-update-szj2h\" (UID: \"8e88c051-4096-47e8-b941-4cee5e3971ef\") " pod="openstack/nova-cell0-ef1a-account-create-update-szj2h"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.944732 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e349818-833c-41bd-83dd-00ca201c100c-operator-scripts\") pod \"nova-cell1-db-create-9nsmf\" (UID: \"7e349818-833c-41bd-83dd-00ca201c100c\") " pod="openstack/nova-cell1-db-create-9nsmf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.961277 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9q2q\" (UniqueName: \"kubernetes.io/projected/7e349818-833c-41bd-83dd-00ca201c100c-kube-api-access-c9q2q\") pod \"nova-cell1-db-create-9nsmf\" (UID: \"7e349818-833c-41bd-83dd-00ca201c100c\") " pod="openstack/nova-cell1-db-create-9nsmf"
Feb 17 09:22:53 crc kubenswrapper[4848]: I0217 09:22:53.966312 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8p6d\" (UniqueName: \"kubernetes.io/projected/8e88c051-4096-47e8-b941-4cee5e3971ef-kube-api-access-d8p6d\") pod \"nova-cell0-ef1a-account-create-update-szj2h\" (UID: \"8e88c051-4096-47e8-b941-4cee5e3971ef\") " pod="openstack/nova-cell0-ef1a-account-create-update-szj2h"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.024202 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-94f2-account-create-update-zdvm6"]
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.025207 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-94f2-account-create-update-zdvm6"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.032229 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.045979 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-94f2-account-create-update-zdvm6"]
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.102465 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-j9fzr"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.115646 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5bbe-account-create-update-9j27z"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.150286 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46k82\" (UniqueName: \"kubernetes.io/projected/aa12092e-4e11-4aa1-a495-383d39eb7806-kube-api-access-46k82\") pod \"nova-cell1-94f2-account-create-update-zdvm6\" (UID: \"aa12092e-4e11-4aa1-a495-383d39eb7806\") " pod="openstack/nova-cell1-94f2-account-create-update-zdvm6"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.150390 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa12092e-4e11-4aa1-a495-383d39eb7806-operator-scripts\") pod \"nova-cell1-94f2-account-create-update-zdvm6\" (UID: \"aa12092e-4e11-4aa1-a495-383d39eb7806\") " pod="openstack/nova-cell1-94f2-account-create-update-zdvm6"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.151301 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9nsmf"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.167003 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ef1a-account-create-update-szj2h"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.198709 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.255485 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46k82\" (UniqueName: \"kubernetes.io/projected/aa12092e-4e11-4aa1-a495-383d39eb7806-kube-api-access-46k82\") pod \"nova-cell1-94f2-account-create-update-zdvm6\" (UID: \"aa12092e-4e11-4aa1-a495-383d39eb7806\") " pod="openstack/nova-cell1-94f2-account-create-update-zdvm6"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.255573 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa12092e-4e11-4aa1-a495-383d39eb7806-operator-scripts\") pod \"nova-cell1-94f2-account-create-update-zdvm6\" (UID: \"aa12092e-4e11-4aa1-a495-383d39eb7806\") " pod="openstack/nova-cell1-94f2-account-create-update-zdvm6"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.256576 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa12092e-4e11-4aa1-a495-383d39eb7806-operator-scripts\") pod \"nova-cell1-94f2-account-create-update-zdvm6\" (UID: \"aa12092e-4e11-4aa1-a495-383d39eb7806\") " pod="openstack/nova-cell1-94f2-account-create-update-zdvm6"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.268099 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906e1770-f9be-4539-8bb9-443d30a30691","Type":"ContainerStarted","Data":"935e5d1283ea3c059f84d265e9004100f52bb8f7129d2c50479ea50f7b3dbcef"}
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.277185 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46k82\" (UniqueName: \"kubernetes.io/projected/aa12092e-4e11-4aa1-a495-383d39eb7806-kube-api-access-46k82\") pod \"nova-cell1-94f2-account-create-update-zdvm6\" (UID: \"aa12092e-4e11-4aa1-a495-383d39eb7806\") " pod="openstack/nova-cell1-94f2-account-create-update-zdvm6"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.375650 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-94f2-account-create-update-zdvm6"
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.410396 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xqmdf"]
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.684109 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-j9fzr"]
Feb 17 09:22:54 crc kubenswrapper[4848]: W0217 09:22:54.714549 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc6db25_7a23_453e_b3e5_331167e6d51e.slice/crio-3e26f9b03dcff41a5bdf88b724e1af4cc2f4300560c4200d07d9b7fa0c70b38d WatchSource:0}: Error finding container 3e26f9b03dcff41a5bdf88b724e1af4cc2f4300560c4200d07d9b7fa0c70b38d: Status 404 returned error can't find the container with id 3e26f9b03dcff41a5bdf88b724e1af4cc2f4300560c4200d07d9b7fa0c70b38d
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.733290 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-ef1a-account-create-update-szj2h"]
Feb 17 09:22:54 crc kubenswrapper[4848]: W0217 09:22:54.920723 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdacd3e3_6fa8_4964_8a0b_f97b7d2732d0.slice/crio-a28b2e6f8c59b0545cc3dbdf7a6f241a146177c7c18dd1868ddd49c199dc2180 WatchSource:0}: Error finding container a28b2e6f8c59b0545cc3dbdf7a6f241a146177c7c18dd1868ddd49c199dc2180: Status 404 returned error can't find the container with id a28b2e6f8c59b0545cc3dbdf7a6f241a146177c7c18dd1868ddd49c199dc2180
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.944196 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-5bbe-account-create-update-9j27z"]
Feb 17 09:22:54 crc kubenswrapper[4848]: I0217 09:22:54.986981 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-9nsmf"]
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.184259 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-94f2-account-create-update-zdvm6"]
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.279076 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j9fzr" event={"ID":"7dc6db25-7a23-453e-b3e5-331167e6d51e","Type":"ContainerStarted","Data":"ca6d48b3e6baeb8fa9f233e1a3b83b8aaf878c5e7552689e75c656e2bb10f306"}
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.279119 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j9fzr" event={"ID":"7dc6db25-7a23-453e-b3e5-331167e6d51e","Type":"ContainerStarted","Data":"3e26f9b03dcff41a5bdf88b724e1af4cc2f4300560c4200d07d9b7fa0c70b38d"}
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.281124 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906e1770-f9be-4539-8bb9-443d30a30691","Type":"ContainerStarted","Data":"98a1a68593f8f7629947e05ce2026f56af30a7867f1f35f8f0c4b0c3faf2092c"}
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.285538 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94f2-account-create-update-zdvm6" event={"ID":"aa12092e-4e11-4aa1-a495-383d39eb7806","Type":"ContainerStarted","Data":"b6f9fd44cfd6cf0acadd6712208f4278018256076643e45c38b2f5fa798315b9"}
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.288794 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ef1a-account-create-update-szj2h" event={"ID":"8e88c051-4096-47e8-b941-4cee5e3971ef","Type":"ContainerStarted","Data":"9557e8b67a86ebb2d14f8165d0832bd360c4b964765da54ad9056348b2c67203"}
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.293783 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xqmdf" event={"ID":"2ef72abf-423d-4df8-8d07-f2340b53ddff","Type":"ContainerStarted","Data":"df83fb9e444b434e99b55684e3b3dc7cea87afd2e1ac228f53efdf7fa987972b"}
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.293835 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xqmdf" event={"ID":"2ef72abf-423d-4df8-8d07-f2340b53ddff","Type":"ContainerStarted","Data":"861340d2d73dec426287551a575fd81259415747681534ad20e6986eff40d058"}
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.297406 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9nsmf" event={"ID":"7e349818-833c-41bd-83dd-00ca201c100c","Type":"ContainerStarted","Data":"9c0a4e3b676119a86a79b193f65ead0d321ddcece2f01499b8057d7793d3972c"}
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.309571 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-j9fzr" podStartSLOduration=2.309553662 podStartE2EDuration="2.309553662s" podCreationTimestamp="2026-02-17 09:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:55.300108097 +0000 UTC m=+1052.843363753" watchObservedRunningTime="2026-02-17 09:22:55.309553662 +0000 UTC m=+1052.852809308"
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.312096 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5bbe-account-create-update-9j27z" event={"ID":"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0","Type":"ContainerStarted","Data":"a28b2e6f8c59b0545cc3dbdf7a6f241a146177c7c18dd1868ddd49c199dc2180"}
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.318011 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-xqmdf" podStartSLOduration=2.3179981 podStartE2EDuration="2.3179981s" podCreationTimestamp="2026-02-17 09:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:22:55.316357906 +0000 UTC m=+1052.859613552" watchObservedRunningTime="2026-02-17 09:22:55.3179981 +0000 UTC m=+1052.861253746"
Feb 17 09:22:55 crc kubenswrapper[4848]: I0217 09:22:55.706787 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-f467db6f-7x6cx"
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.324166 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906e1770-f9be-4539-8bb9-443d30a30691","Type":"ContainerStarted","Data":"f79528e18c5cc8f692fa67e79a1867e949b33973f383b19bde58125138a2d245"}
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.331390 4848 generic.go:334] "Generic (PLEG): container finished" podID="aa12092e-4e11-4aa1-a495-383d39eb7806" containerID="68ac475da06f9e5cb5def7b9378fb02469850335465993b12efa2f8f2e38f3db" exitCode=0
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.331471 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94f2-account-create-update-zdvm6" event={"ID":"aa12092e-4e11-4aa1-a495-383d39eb7806","Type":"ContainerDied","Data":"68ac475da06f9e5cb5def7b9378fb02469850335465993b12efa2f8f2e38f3db"}
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.333979 4848 generic.go:334] "Generic (PLEG): container finished" podID="8e88c051-4096-47e8-b941-4cee5e3971ef" containerID="ac0420ac46441099abbb97df4471c45378300aa407c295c6dc5601869f8cfc14" exitCode=0
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.334040 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ef1a-account-create-update-szj2h" event={"ID":"8e88c051-4096-47e8-b941-4cee5e3971ef","Type":"ContainerDied","Data":"ac0420ac46441099abbb97df4471c45378300aa407c295c6dc5601869f8cfc14"}
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.335988 4848 generic.go:334] "Generic (PLEG): container finished" podID="2ef72abf-423d-4df8-8d07-f2340b53ddff" containerID="df83fb9e444b434e99b55684e3b3dc7cea87afd2e1ac228f53efdf7fa987972b" exitCode=0
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.336044 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xqmdf" event={"ID":"2ef72abf-423d-4df8-8d07-f2340b53ddff","Type":"ContainerDied","Data":"df83fb9e444b434e99b55684e3b3dc7cea87afd2e1ac228f53efdf7fa987972b"}
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.338695 4848 generic.go:334] "Generic (PLEG): container finished" podID="7e349818-833c-41bd-83dd-00ca201c100c" containerID="9c748faf11b96df9c24f648de549bf9743322f6ee4f0112dc2c322a8f7e8b421" exitCode=0
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.338750 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9nsmf" event={"ID":"7e349818-833c-41bd-83dd-00ca201c100c","Type":"ContainerDied","Data":"9c748faf11b96df9c24f648de549bf9743322f6ee4f0112dc2c322a8f7e8b421"}
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.340213 4848 generic.go:334] "Generic (PLEG): container finished" podID="cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0" containerID="e66fb734dca6f821d231e6f7e98ca5f789b1694c1ec5e60d44cd0205a33b0419" exitCode=0
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.340269 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5bbe-account-create-update-9j27z" event={"ID":"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0","Type":"ContainerDied","Data":"e66fb734dca6f821d231e6f7e98ca5f789b1694c1ec5e60d44cd0205a33b0419"}
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.341783 4848 generic.go:334] "Generic (PLEG): container finished" podID="7dc6db25-7a23-453e-b3e5-331167e6d51e" containerID="ca6d48b3e6baeb8fa9f233e1a3b83b8aaf878c5e7552689e75c656e2bb10f306" exitCode=0
Feb 17 09:22:56 crc kubenswrapper[4848]: I0217 09:22:56.341816 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j9fzr" event={"ID":"7dc6db25-7a23-453e-b3e5-331167e6d51e","Type":"ContainerDied","Data":"ca6d48b3e6baeb8fa9f233e1a3b83b8aaf878c5e7552689e75c656e2bb10f306"}
Feb 17 09:22:57 crc kubenswrapper[4848]: I0217 09:22:57.861127 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-ef1a-account-create-update-szj2h"
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.005367 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8p6d\" (UniqueName: \"kubernetes.io/projected/8e88c051-4096-47e8-b941-4cee5e3971ef-kube-api-access-d8p6d\") pod \"8e88c051-4096-47e8-b941-4cee5e3971ef\" (UID: \"8e88c051-4096-47e8-b941-4cee5e3971ef\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.005795 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e88c051-4096-47e8-b941-4cee5e3971ef-operator-scripts\") pod \"8e88c051-4096-47e8-b941-4cee5e3971ef\" (UID: \"8e88c051-4096-47e8-b941-4cee5e3971ef\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.006547 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e88c051-4096-47e8-b941-4cee5e3971ef-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e88c051-4096-47e8-b941-4cee5e3971ef" (UID: "8e88c051-4096-47e8-b941-4cee5e3971ef"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.013690 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e88c051-4096-47e8-b941-4cee5e3971ef-kube-api-access-d8p6d" (OuterVolumeSpecName: "kube-api-access-d8p6d") pod "8e88c051-4096-47e8-b941-4cee5e3971ef" (UID: "8e88c051-4096-47e8-b941-4cee5e3971ef"). InnerVolumeSpecName "kube-api-access-d8p6d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.038921 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-j9fzr"
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.076318 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9nsmf"
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.083483 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-5bbe-account-create-update-9j27z"
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.111305 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-94f2-account-create-update-zdvm6"
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.121543 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xqmdf"
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.121972 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7lcx\" (UniqueName: \"kubernetes.io/projected/7dc6db25-7a23-453e-b3e5-331167e6d51e-kube-api-access-t7lcx\") pod \"7dc6db25-7a23-453e-b3e5-331167e6d51e\" (UID: \"7dc6db25-7a23-453e-b3e5-331167e6d51e\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.122065 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dc6db25-7a23-453e-b3e5-331167e6d51e-operator-scripts\") pod \"7dc6db25-7a23-453e-b3e5-331167e6d51e\" (UID: \"7dc6db25-7a23-453e-b3e5-331167e6d51e\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.122637 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc6db25-7a23-453e-b3e5-331167e6d51e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7dc6db25-7a23-453e-b3e5-331167e6d51e" (UID: "7dc6db25-7a23-453e-b3e5-331167e6d51e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.131470 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e88c051-4096-47e8-b941-4cee5e3971ef-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.131517 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7dc6db25-7a23-453e-b3e5-331167e6d51e-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.131528 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8p6d\" (UniqueName: \"kubernetes.io/projected/8e88c051-4096-47e8-b941-4cee5e3971ef-kube-api-access-d8p6d\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.133873 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc6db25-7a23-453e-b3e5-331167e6d51e-kube-api-access-t7lcx" (OuterVolumeSpecName: "kube-api-access-t7lcx") pod "7dc6db25-7a23-453e-b3e5-331167e6d51e" (UID: "7dc6db25-7a23-453e-b3e5-331167e6d51e"). InnerVolumeSpecName "kube-api-access-t7lcx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.232772 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lp7w\" (UniqueName: \"kubernetes.io/projected/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-kube-api-access-6lp7w\") pod \"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0\" (UID: \"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.232835 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9q2q\" (UniqueName: \"kubernetes.io/projected/7e349818-833c-41bd-83dd-00ca201c100c-kube-api-access-c9q2q\") pod \"7e349818-833c-41bd-83dd-00ca201c100c\" (UID: \"7e349818-833c-41bd-83dd-00ca201c100c\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.232899 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef72abf-423d-4df8-8d07-f2340b53ddff-operator-scripts\") pod \"2ef72abf-423d-4df8-8d07-f2340b53ddff\" (UID: \"2ef72abf-423d-4df8-8d07-f2340b53ddff\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.232938 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46k82\" (UniqueName: \"kubernetes.io/projected/aa12092e-4e11-4aa1-a495-383d39eb7806-kube-api-access-46k82\") pod \"aa12092e-4e11-4aa1-a495-383d39eb7806\" (UID: \"aa12092e-4e11-4aa1-a495-383d39eb7806\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.233055 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-operator-scripts\") pod \"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0\" (UID: \"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.233109 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa12092e-4e11-4aa1-a495-383d39eb7806-operator-scripts\") pod \"aa12092e-4e11-4aa1-a495-383d39eb7806\" (UID: \"aa12092e-4e11-4aa1-a495-383d39eb7806\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.233138 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e349818-833c-41bd-83dd-00ca201c100c-operator-scripts\") pod \"7e349818-833c-41bd-83dd-00ca201c100c\" (UID: \"7e349818-833c-41bd-83dd-00ca201c100c\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.233163 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mwmtt\" (UniqueName: \"kubernetes.io/projected/2ef72abf-423d-4df8-8d07-f2340b53ddff-kube-api-access-mwmtt\") pod \"2ef72abf-423d-4df8-8d07-f2340b53ddff\" (UID: \"2ef72abf-423d-4df8-8d07-f2340b53ddff\") "
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.233454 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef72abf-423d-4df8-8d07-f2340b53ddff-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ef72abf-423d-4df8-8d07-f2340b53ddff" (UID: "2ef72abf-423d-4df8-8d07-f2340b53ddff"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.233526 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0" (UID: "cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.233912 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef72abf-423d-4df8-8d07-f2340b53ddff-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.233928 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-operator-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.233938 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7lcx\" (UniqueName: \"kubernetes.io/projected/7dc6db25-7a23-453e-b3e5-331167e6d51e-kube-api-access-t7lcx\") on node \"crc\" DevicePath \"\""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.233969 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e349818-833c-41bd-83dd-00ca201c100c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7e349818-833c-41bd-83dd-00ca201c100c" (UID: "7e349818-833c-41bd-83dd-00ca201c100c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.234014 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa12092e-4e11-4aa1-a495-383d39eb7806-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aa12092e-4e11-4aa1-a495-383d39eb7806" (UID: "aa12092e-4e11-4aa1-a495-383d39eb7806"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.237681 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-kube-api-access-6lp7w" (OuterVolumeSpecName: "kube-api-access-6lp7w") pod "cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0" (UID: "cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0"). InnerVolumeSpecName "kube-api-access-6lp7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.237977 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e349818-833c-41bd-83dd-00ca201c100c-kube-api-access-c9q2q" (OuterVolumeSpecName: "kube-api-access-c9q2q") pod "7e349818-833c-41bd-83dd-00ca201c100c" (UID: "7e349818-833c-41bd-83dd-00ca201c100c"). InnerVolumeSpecName "kube-api-access-c9q2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.238093 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa12092e-4e11-4aa1-a495-383d39eb7806-kube-api-access-46k82" (OuterVolumeSpecName: "kube-api-access-46k82") pod "aa12092e-4e11-4aa1-a495-383d39eb7806" (UID: "aa12092e-4e11-4aa1-a495-383d39eb7806"). InnerVolumeSpecName "kube-api-access-46k82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.247836 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef72abf-423d-4df8-8d07-f2340b53ddff-kube-api-access-mwmtt" (OuterVolumeSpecName: "kube-api-access-mwmtt") pod "2ef72abf-423d-4df8-8d07-f2340b53ddff" (UID: "2ef72abf-423d-4df8-8d07-f2340b53ddff"). InnerVolumeSpecName "kube-api-access-mwmtt".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.335838 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aa12092e-4e11-4aa1-a495-383d39eb7806-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.335875 4848 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7e349818-833c-41bd-83dd-00ca201c100c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.335888 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mwmtt\" (UniqueName: \"kubernetes.io/projected/2ef72abf-423d-4df8-8d07-f2340b53ddff-kube-api-access-mwmtt\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.335897 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lp7w\" (UniqueName: \"kubernetes.io/projected/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0-kube-api-access-6lp7w\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.335908 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9q2q\" (UniqueName: \"kubernetes.io/projected/7e349818-833c-41bd-83dd-00ca201c100c-kube-api-access-c9q2q\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.335917 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46k82\" (UniqueName: \"kubernetes.io/projected/aa12092e-4e11-4aa1-a495-383d39eb7806-kube-api-access-46k82\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.501502 4848 generic.go:334] "Generic (PLEG): container finished" podID="60ec54d0-c985-48d2-b081-c1082d132e65" containerID="3df3f5833c5cdb4a1c7b5b2236a4f4786eca712945f32218dd228bd723839960" exitCode=0 Feb 17 09:22:58 
crc kubenswrapper[4848]: I0217 09:22:58.501963 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cb4bc9fb8-r52gj" event={"ID":"60ec54d0-c985-48d2-b081-c1082d132e65","Type":"ContainerDied","Data":"3df3f5833c5cdb4a1c7b5b2236a4f4786eca712945f32218dd228bd723839960"} Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.525166 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-9nsmf" event={"ID":"7e349818-833c-41bd-83dd-00ca201c100c","Type":"ContainerDied","Data":"9c0a4e3b676119a86a79b193f65ead0d321ddcece2f01499b8057d7793d3972c"} Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.525204 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c0a4e3b676119a86a79b193f65ead0d321ddcece2f01499b8057d7793d3972c" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.525263 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-9nsmf" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.547039 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-5bbe-account-create-update-9j27z" event={"ID":"cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0","Type":"ContainerDied","Data":"a28b2e6f8c59b0545cc3dbdf7a6f241a146177c7c18dd1868ddd49c199dc2180"} Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.547070 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a28b2e6f8c59b0545cc3dbdf7a6f241a146177c7c18dd1868ddd49c199dc2180" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.547124 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-5bbe-account-create-update-9j27z" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.590077 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-j9fzr" event={"ID":"7dc6db25-7a23-453e-b3e5-331167e6d51e","Type":"ContainerDied","Data":"3e26f9b03dcff41a5bdf88b724e1af4cc2f4300560c4200d07d9b7fa0c70b38d"} Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.590112 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e26f9b03dcff41a5bdf88b724e1af4cc2f4300560c4200d07d9b7fa0c70b38d" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.590118 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-j9fzr" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.608608 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.624164 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="ceilometer-central-agent" containerID="cri-o://935e5d1283ea3c059f84d265e9004100f52bb8f7129d2c50479ea50f7b3dbcef" gracePeriod=30 Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.624257 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="proxy-httpd" containerID="cri-o://45430d78172ca5d50a6dc833341639743bbc6f8c79cee072e40ca09cff63e39b" gracePeriod=30 Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.624304 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="ceilometer-notification-agent" 
containerID="cri-o://98a1a68593f8f7629947e05ce2026f56af30a7867f1f35f8f0c4b0c3faf2092c" gracePeriod=30 Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.624427 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="sg-core" containerID="cri-o://f79528e18c5cc8f692fa67e79a1867e949b33973f383b19bde58125138a2d245" gracePeriod=30 Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.624080 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906e1770-f9be-4539-8bb9-443d30a30691","Type":"ContainerStarted","Data":"45430d78172ca5d50a6dc833341639743bbc6f8c79cee072e40ca09cff63e39b"} Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.624937 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.627313 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-94f2-account-create-update-zdvm6" event={"ID":"aa12092e-4e11-4aa1-a495-383d39eb7806","Type":"ContainerDied","Data":"b6f9fd44cfd6cf0acadd6712208f4278018256076643e45c38b2f5fa798315b9"} Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.627339 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6f9fd44cfd6cf0acadd6712208f4278018256076643e45c38b2f5fa798315b9" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.627382 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-94f2-account-create-update-zdvm6" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.651510 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-ef1a-account-create-update-szj2h" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.651513 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-ef1a-account-create-update-szj2h" event={"ID":"8e88c051-4096-47e8-b941-4cee5e3971ef","Type":"ContainerDied","Data":"9557e8b67a86ebb2d14f8165d0832bd360c4b964765da54ad9056348b2c67203"} Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.651559 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9557e8b67a86ebb2d14f8165d0832bd360c4b964765da54ad9056348b2c67203" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.667325 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.626454357 podStartE2EDuration="7.667307453s" podCreationTimestamp="2026-02-17 09:22:51 +0000 UTC" firstStartedPulling="2026-02-17 09:22:52.1902596 +0000 UTC m=+1049.733515246" lastFinishedPulling="2026-02-17 09:22:57.231112696 +0000 UTC m=+1054.774368342" observedRunningTime="2026-02-17 09:22:58.666386489 +0000 UTC m=+1056.209642135" watchObservedRunningTime="2026-02-17 09:22:58.667307453 +0000 UTC m=+1056.210563099" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.667834 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xqmdf" event={"ID":"2ef72abf-423d-4df8-8d07-f2340b53ddff","Type":"ContainerDied","Data":"861340d2d73dec426287551a575fd81259415747681534ad20e6986eff40d058"} Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.667875 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="861340d2d73dec426287551a575fd81259415747681534ad20e6986eff40d058" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.667947 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xqmdf" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.676728 4848 generic.go:334] "Generic (PLEG): container finished" podID="1068aa99-55d4-4778-ac02-b354de25d16e" containerID="479b497cdf95e27b9cf69855fe7ee2bce4facd9452020fe51dd99e0da1ac4e2a" exitCode=137 Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.676791 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-749cc47784-q9crv" event={"ID":"1068aa99-55d4-4778-ac02-b354de25d16e","Type":"ContainerDied","Data":"479b497cdf95e27b9cf69855fe7ee2bce4facd9452020fe51dd99e0da1ac4e2a"} Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.676826 4848 scope.go:117] "RemoveContainer" containerID="0cafaca3851618f0ffdb14a0cf40a0bb789ede4dd0cfff526ba29a414b7a3d5c" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.676987 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-749cc47784-q9crv" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.692093 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6cb4bc9fb8-r52gj" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.744593 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-tls-certs\") pod \"1068aa99-55d4-4778-ac02-b354de25d16e\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.744643 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-combined-ca-bundle\") pod \"1068aa99-55d4-4778-ac02-b354de25d16e\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.744675 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-secret-key\") pod \"1068aa99-55d4-4778-ac02-b354de25d16e\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.744848 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1068aa99-55d4-4778-ac02-b354de25d16e-logs\") pod \"1068aa99-55d4-4778-ac02-b354de25d16e\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.744895 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-config-data\") pod \"1068aa99-55d4-4778-ac02-b354de25d16e\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.744940 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-scripts\") pod \"1068aa99-55d4-4778-ac02-b354de25d16e\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.744963 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2szpc\" (UniqueName: \"kubernetes.io/projected/1068aa99-55d4-4778-ac02-b354de25d16e-kube-api-access-2szpc\") pod \"1068aa99-55d4-4778-ac02-b354de25d16e\" (UID: \"1068aa99-55d4-4778-ac02-b354de25d16e\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.746604 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1068aa99-55d4-4778-ac02-b354de25d16e-logs" (OuterVolumeSpecName: "logs") pod "1068aa99-55d4-4778-ac02-b354de25d16e" (UID: "1068aa99-55d4-4778-ac02-b354de25d16e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.749508 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "1068aa99-55d4-4778-ac02-b354de25d16e" (UID: "1068aa99-55d4-4778-ac02-b354de25d16e"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.749672 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1068aa99-55d4-4778-ac02-b354de25d16e-kube-api-access-2szpc" (OuterVolumeSpecName: "kube-api-access-2szpc") pod "1068aa99-55d4-4778-ac02-b354de25d16e" (UID: "1068aa99-55d4-4778-ac02-b354de25d16e"). InnerVolumeSpecName "kube-api-access-2szpc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.782732 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-config-data" (OuterVolumeSpecName: "config-data") pod "1068aa99-55d4-4778-ac02-b354de25d16e" (UID: "1068aa99-55d4-4778-ac02-b354de25d16e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.784624 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-scripts" (OuterVolumeSpecName: "scripts") pod "1068aa99-55d4-4778-ac02-b354de25d16e" (UID: "1068aa99-55d4-4778-ac02-b354de25d16e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.793878 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "1068aa99-55d4-4778-ac02-b354de25d16e" (UID: "1068aa99-55d4-4778-ac02-b354de25d16e"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.813858 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1068aa99-55d4-4778-ac02-b354de25d16e" (UID: "1068aa99-55d4-4778-ac02-b354de25d16e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.847256 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-config\") pod \"60ec54d0-c985-48d2-b081-c1082d132e65\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.847437 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-combined-ca-bundle\") pod \"60ec54d0-c985-48d2-b081-c1082d132e65\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.847482 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-ovndb-tls-certs\") pod \"60ec54d0-c985-48d2-b081-c1082d132e65\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.847575 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-httpd-config\") pod \"60ec54d0-c985-48d2-b081-c1082d132e65\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.847658 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cg6dv\" (UniqueName: \"kubernetes.io/projected/60ec54d0-c985-48d2-b081-c1082d132e65-kube-api-access-cg6dv\") pod \"60ec54d0-c985-48d2-b081-c1082d132e65\" (UID: \"60ec54d0-c985-48d2-b081-c1082d132e65\") " Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.848213 4848 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.848236 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.848249 4848 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/1068aa99-55d4-4778-ac02-b354de25d16e-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.848262 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1068aa99-55d4-4778-ac02-b354de25d16e-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.848273 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.848283 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1068aa99-55d4-4778-ac02-b354de25d16e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.848293 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2szpc\" (UniqueName: \"kubernetes.io/projected/1068aa99-55d4-4778-ac02-b354de25d16e-kube-api-access-2szpc\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.855049 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60ec54d0-c985-48d2-b081-c1082d132e65-kube-api-access-cg6dv" (OuterVolumeSpecName: "kube-api-access-cg6dv") pod 
"60ec54d0-c985-48d2-b081-c1082d132e65" (UID: "60ec54d0-c985-48d2-b081-c1082d132e65"). InnerVolumeSpecName "kube-api-access-cg6dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.858938 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "60ec54d0-c985-48d2-b081-c1082d132e65" (UID: "60ec54d0-c985-48d2-b081-c1082d132e65"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.889100 4848 scope.go:117] "RemoveContainer" containerID="479b497cdf95e27b9cf69855fe7ee2bce4facd9452020fe51dd99e0da1ac4e2a" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.926301 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60ec54d0-c985-48d2-b081-c1082d132e65" (UID: "60ec54d0-c985-48d2-b081-c1082d132e65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.947357 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-config" (OuterVolumeSpecName: "config") pod "60ec54d0-c985-48d2-b081-c1082d132e65" (UID: "60ec54d0-c985-48d2-b081-c1082d132e65"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.950143 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.950175 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.950184 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.950194 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cg6dv\" (UniqueName: \"kubernetes.io/projected/60ec54d0-c985-48d2-b081-c1082d132e65-kube-api-access-cg6dv\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:58 crc kubenswrapper[4848]: I0217 09:22:58.960819 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "60ec54d0-c985-48d2-b081-c1082d132e65" (UID: "60ec54d0-c985-48d2-b081-c1082d132e65"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.014889 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-749cc47784-q9crv"] Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.022127 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-749cc47784-q9crv"] Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.052149 4848 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/60ec54d0-c985-48d2-b081-c1082d132e65-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.398166 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" path="/var/lib/kubelet/pods/1068aa99-55d4-4778-ac02-b354de25d16e/volumes" Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.458820 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.459307 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" containerName="glance-log" containerID="cri-o://c24f289f8c51994798352ea14feaba690fc4d2d2b20d674b644427de659a078d" gracePeriod=30 Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.459747 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" containerName="glance-httpd" containerID="cri-o://0a78c085be1066ed063fef22ab745f758a882d08a488417f71ebd578f2619f8f" gracePeriod=30 Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.685685 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6cb4bc9fb8-r52gj" 
event={"ID":"60ec54d0-c985-48d2-b081-c1082d132e65","Type":"ContainerDied","Data":"fb743eed00414d9aec17782d468240a35660f5722df178e955a50ce240ef2cc8"}
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.685745 4848 scope.go:117] "RemoveContainer" containerID="f6ea2a2dd8077ea22a1495dc5fd86e223ff5877b1de922b517b656dd0480dc14"
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.685790 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6cb4bc9fb8-r52gj"
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.688720 4848 generic.go:334] "Generic (PLEG): container finished" podID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" containerID="c24f289f8c51994798352ea14feaba690fc4d2d2b20d674b644427de659a078d" exitCode=143
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.688799 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e81fa19e-dc1a-4f09-8f05-b28099ae5f03","Type":"ContainerDied","Data":"c24f289f8c51994798352ea14feaba690fc4d2d2b20d674b644427de659a078d"}
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.695200 4848 generic.go:334] "Generic (PLEG): container finished" podID="906e1770-f9be-4539-8bb9-443d30a30691" containerID="45430d78172ca5d50a6dc833341639743bbc6f8c79cee072e40ca09cff63e39b" exitCode=0
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.695218 4848 generic.go:334] "Generic (PLEG): container finished" podID="906e1770-f9be-4539-8bb9-443d30a30691" containerID="f79528e18c5cc8f692fa67e79a1867e949b33973f383b19bde58125138a2d245" exitCode=2
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.695226 4848 generic.go:334] "Generic (PLEG): container finished" podID="906e1770-f9be-4539-8bb9-443d30a30691" containerID="98a1a68593f8f7629947e05ce2026f56af30a7867f1f35f8f0c4b0c3faf2092c" exitCode=0
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.695241 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906e1770-f9be-4539-8bb9-443d30a30691","Type":"ContainerDied","Data":"45430d78172ca5d50a6dc833341639743bbc6f8c79cee072e40ca09cff63e39b"}
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.695255 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906e1770-f9be-4539-8bb9-443d30a30691","Type":"ContainerDied","Data":"f79528e18c5cc8f692fa67e79a1867e949b33973f383b19bde58125138a2d245"}
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.695264 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906e1770-f9be-4539-8bb9-443d30a30691","Type":"ContainerDied","Data":"98a1a68593f8f7629947e05ce2026f56af30a7867f1f35f8f0c4b0c3faf2092c"}
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.709981 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6cb4bc9fb8-r52gj"]
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.711633 4848 scope.go:117] "RemoveContainer" containerID="3df3f5833c5cdb4a1c7b5b2236a4f4786eca712945f32218dd228bd723839960"
Feb 17 09:22:59 crc kubenswrapper[4848]: I0217 09:22:59.731448 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6cb4bc9fb8-r52gj"]
Feb 17 09:23:00 crc kubenswrapper[4848]: I0217 09:23:00.582369 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 17 09:23:00 crc kubenswrapper[4848]: I0217 09:23:00.582660 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" containerName="glance-log" containerID="cri-o://05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7" gracePeriod=30
Feb 17 09:23:00 crc kubenswrapper[4848]: I0217 09:23:00.583206 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" containerName="glance-httpd" containerID="cri-o://7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247" gracePeriod=30
Feb 17 09:23:00 crc kubenswrapper[4848]: I0217 09:23:00.716686 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-f467db6f-7x6cx"
Feb 17 09:23:01 crc kubenswrapper[4848]: I0217 09:23:01.394465 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60ec54d0-c985-48d2-b081-c1082d132e65" path="/var/lib/kubelet/pods/60ec54d0-c985-48d2-b081-c1082d132e65/volumes"
Feb 17 09:23:01 crc kubenswrapper[4848]: I0217 09:23:01.716062 4848 generic.go:334] "Generic (PLEG): container finished" podID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" containerID="05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7" exitCode=143
Feb 17 09:23:01 crc kubenswrapper[4848]: I0217 09:23:01.716153 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4","Type":"ContainerDied","Data":"05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7"}
Feb 17 09:23:02 crc kubenswrapper[4848]: I0217 09:23:02.740699 4848 generic.go:334] "Generic (PLEG): container finished" podID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" containerID="0a78c085be1066ed063fef22ab745f758a882d08a488417f71ebd578f2619f8f" exitCode=0
Feb 17 09:23:02 crc kubenswrapper[4848]: I0217 09:23:02.740740 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e81fa19e-dc1a-4f09-8f05-b28099ae5f03","Type":"ContainerDied","Data":"0a78c085be1066ed063fef22ab745f758a882d08a488417f71ebd578f2619f8f"}
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.182049 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.328977 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-scripts\") pod \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") "
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.329026 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5d7l6\" (UniqueName: \"kubernetes.io/projected/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-kube-api-access-5d7l6\") pod \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") "
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.329060 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-httpd-run\") pod \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") "
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.329089 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-public-tls-certs\") pod \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") "
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.329106 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-config-data\") pod \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") "
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.329190 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-combined-ca-bundle\") pod \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") "
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.329210 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") "
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.329235 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-logs\") pod \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\" (UID: \"e81fa19e-dc1a-4f09-8f05-b28099ae5f03\") "
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.330024 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-logs" (OuterVolumeSpecName: "logs") pod "e81fa19e-dc1a-4f09-8f05-b28099ae5f03" (UID: "e81fa19e-dc1a-4f09-8f05-b28099ae5f03"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.330572 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e81fa19e-dc1a-4f09-8f05-b28099ae5f03" (UID: "e81fa19e-dc1a-4f09-8f05-b28099ae5f03"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.335605 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-kube-api-access-5d7l6" (OuterVolumeSpecName: "kube-api-access-5d7l6") pod "e81fa19e-dc1a-4f09-8f05-b28099ae5f03" (UID: "e81fa19e-dc1a-4f09-8f05-b28099ae5f03"). InnerVolumeSpecName "kube-api-access-5d7l6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.337572 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-scripts" (OuterVolumeSpecName: "scripts") pod "e81fa19e-dc1a-4f09-8f05-b28099ae5f03" (UID: "e81fa19e-dc1a-4f09-8f05-b28099ae5f03"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.337685 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "e81fa19e-dc1a-4f09-8f05-b28099ae5f03" (UID: "e81fa19e-dc1a-4f09-8f05-b28099ae5f03"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.376139 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e81fa19e-dc1a-4f09-8f05-b28099ae5f03" (UID: "e81fa19e-dc1a-4f09-8f05-b28099ae5f03"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.419123 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e81fa19e-dc1a-4f09-8f05-b28099ae5f03" (UID: "e81fa19e-dc1a-4f09-8f05-b28099ae5f03"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.430894 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-config-data" (OuterVolumeSpecName: "config-data") pod "e81fa19e-dc1a-4f09-8f05-b28099ae5f03" (UID: "e81fa19e-dc1a-4f09-8f05-b28099ae5f03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.430892 4848 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.430956 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.430989 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" "
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.431002 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-logs\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.431014 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-scripts\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.431026 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5d7l6\" (UniqueName: \"kubernetes.io/projected/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-kube-api-access-5d7l6\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.431039 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.452339 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.532656 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81fa19e-dc1a-4f09-8f05-b28099ae5f03-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.533015 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.749661 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"e81fa19e-dc1a-4f09-8f05-b28099ae5f03","Type":"ContainerDied","Data":"ed8525886dba4381a8ff46086a4bf6ff83e3beaa89f6091e6d44a5eff3cd732a"}
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.749715 4848 scope.go:117] "RemoveContainer" containerID="0a78c085be1066ed063fef22ab745f758a882d08a488417f71ebd578f2619f8f"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.749720 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.894252 4848 scope.go:117] "RemoveContainer" containerID="c24f289f8c51994798352ea14feaba690fc4d2d2b20d674b644427de659a078d"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.909607 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.922973 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.955367 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.955834 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef72abf-423d-4df8-8d07-f2340b53ddff" containerName="mariadb-database-create"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.955851 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef72abf-423d-4df8-8d07-f2340b53ddff" containerName="mariadb-database-create"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.955870 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon-log"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.955879 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon-log"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.955895 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0" containerName="mariadb-account-create-update"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.955904 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0" containerName="mariadb-account-create-update"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.955921 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e349818-833c-41bd-83dd-00ca201c100c" containerName="mariadb-database-create"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.955928 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e349818-833c-41bd-83dd-00ca201c100c" containerName="mariadb-database-create"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.955944 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.955952 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.955965 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ec54d0-c985-48d2-b081-c1082d132e65" containerName="neutron-httpd"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.955972 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ec54d0-c985-48d2-b081-c1082d132e65" containerName="neutron-httpd"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.955985 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e88c051-4096-47e8-b941-4cee5e3971ef" containerName="mariadb-account-create-update"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.955992 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e88c051-4096-47e8-b941-4cee5e3971ef" containerName="mariadb-account-create-update"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.956009 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa12092e-4e11-4aa1-a495-383d39eb7806" containerName="mariadb-account-create-update"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.956019 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa12092e-4e11-4aa1-a495-383d39eb7806" containerName="mariadb-account-create-update"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.956033 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" containerName="glance-httpd"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.956042 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" containerName="glance-httpd"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.956055 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" containerName="glance-log"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.956063 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" containerName="glance-log"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.956077 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60ec54d0-c985-48d2-b081-c1082d132e65" containerName="neutron-api"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.956085 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="60ec54d0-c985-48d2-b081-c1082d132e65" containerName="neutron-api"
Feb 17 09:23:03 crc kubenswrapper[4848]: E0217 09:23:03.956108 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc6db25-7a23-453e-b3e5-331167e6d51e" containerName="mariadb-database-create"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.956116 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc6db25-7a23-453e-b3e5-331167e6d51e" containerName="mariadb-database-create"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957413 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon-log"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957441 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ec54d0-c985-48d2-b081-c1082d132e65" containerName="neutron-api"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957455 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e88c051-4096-47e8-b941-4cee5e3971ef" containerName="mariadb-account-create-update"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957469 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e349818-833c-41bd-83dd-00ca201c100c" containerName="mariadb-database-create"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957480 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="60ec54d0-c985-48d2-b081-c1082d132e65" containerName="neutron-httpd"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957491 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef72abf-423d-4df8-8d07-f2340b53ddff" containerName="mariadb-database-create"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957503 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa12092e-4e11-4aa1-a495-383d39eb7806" containerName="mariadb-account-create-update"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957518 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" containerName="glance-httpd"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957526 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" containerName="glance-log"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957540 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc6db25-7a23-453e-b3e5-331167e6d51e" containerName="mariadb-database-create"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957551 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0" containerName="mariadb-account-create-update"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.957564 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="1068aa99-55d4-4778-ac02-b354de25d16e" containerName="horizon"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.958668 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.967324 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 17 09:23:03 crc kubenswrapper[4848]: I0217 09:23:03.968703 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:03.997984 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.042701 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.042789 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-config-data\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.042841 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.042871 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-scripts\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.042892 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.042940 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.043019 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwrg5\" (UniqueName: \"kubernetes.io/projected/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-kube-api-access-nwrg5\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.043085 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-logs\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.055713 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5js8t"]
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.056972 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.066544 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.066834 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.066933 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vcxc2"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.084357 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5js8t"]
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145182 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145237 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-config-data\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145253 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-scripts\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145284 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwrg5\" (UniqueName: \"kubernetes.io/projected/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-kube-api-access-nwrg5\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145324 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-logs\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145351 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cr4g\" (UniqueName: \"kubernetes.io/projected/910f96d1-b14a-49ea-9153-1fb90774711d-kube-api-access-4cr4g\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145390 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145412 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-config-data\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145436 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145460 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145479 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-scripts\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.145493 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.148451 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.152466 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-logs\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.152738 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.154549 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.161135 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-config-data\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.167570 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.182580 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-scripts\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.187846 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwrg5\" (UniqueName: \"kubernetes.io/projected/ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379-kube-api-access-nwrg5\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.243899 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-0\" (UID: \"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379\") " pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.247827 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-config-data\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.247888 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-scripts\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.247960 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cr4g\" (UniqueName: \"kubernetes.io/projected/910f96d1-b14a-49ea-9153-1fb90774711d-kube-api-access-4cr4g\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.248013 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.251021 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-config-data\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.252913 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-scripts\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.254336 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.268273 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cr4g\" (UniqueName: \"kubernetes.io/projected/910f96d1-b14a-49ea-9153-1fb90774711d-kube-api-access-4cr4g\") pod \"nova-cell0-conductor-db-sync-5js8t\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " pod="openstack/nova-cell0-conductor-db-sync-5js8t"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.295176 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.324522 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.348549 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-logs\") pod \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.348768 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-scripts\") pod \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.348830 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph4q7\" (UniqueName: \"kubernetes.io/projected/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-kube-api-access-ph4q7\") pod \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\"
(UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.348895 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-config-data\") pod \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.348924 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-httpd-run\") pod \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.348971 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-internal-tls-certs\") pod \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.348997 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.349021 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-combined-ca-bundle\") pod \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\" (UID: \"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4\") " Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.350133 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-httpd-run" 
(OuterVolumeSpecName: "httpd-run") pod "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" (UID: "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.350912 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-logs" (OuterVolumeSpecName: "logs") pod "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" (UID: "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.359920 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-scripts" (OuterVolumeSpecName: "scripts") pod "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" (UID: "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.359993 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" (UID: "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.368920 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-kube-api-access-ph4q7" (OuterVolumeSpecName: "kube-api-access-ph4q7") pod "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" (UID: "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4"). InnerVolumeSpecName "kube-api-access-ph4q7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.399574 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5js8t" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.441951 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" (UID: "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.456190 4848 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.456264 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.456275 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.456285 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.456294 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:04 crc 
kubenswrapper[4848]: I0217 09:23:04.456302 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph4q7\" (UniqueName: \"kubernetes.io/projected/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-kube-api-access-ph4q7\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.459909 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-config-data" (OuterVolumeSpecName: "config-data") pod "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" (UID: "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.469903 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" (UID: "2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.487941 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.560864 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.561185 4848 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.561197 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.773787 4848 generic.go:334] "Generic (PLEG): container finished" podID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" containerID="7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247" exitCode=0 Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.773844 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4","Type":"ContainerDied","Data":"7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247"} Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.773869 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4","Type":"ContainerDied","Data":"fa5eabefbf28961d54f145f1b1ba5242223520cf7aa651859a2a86b19042bad2"} Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 
09:23:04.773890 4848 scope.go:117] "RemoveContainer" containerID="7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.773985 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.819312 4848 scope.go:117] "RemoveContainer" containerID="05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.845544 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.854821 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.855179 4848 scope.go:117] "RemoveContainer" containerID="7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247" Feb 17 09:23:04 crc kubenswrapper[4848]: E0217 09:23:04.855678 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247\": container with ID starting with 7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247 not found: ID does not exist" containerID="7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.855719 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247"} err="failed to get container status \"7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247\": rpc error: code = NotFound desc = could not find container \"7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247\": container with ID starting with 
7df204035a6f669c198c02047c8cbb1aabf451d20bed321fab8885c38c9f4247 not found: ID does not exist" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.855751 4848 scope.go:117] "RemoveContainer" containerID="05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7" Feb 17 09:23:04 crc kubenswrapper[4848]: E0217 09:23:04.856099 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7\": container with ID starting with 05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7 not found: ID does not exist" containerID="05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.856121 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7"} err="failed to get container status \"05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7\": rpc error: code = NotFound desc = could not find container \"05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7\": container with ID starting with 05231b5d6f5ec5cbc776f080a0317e40903d96f52f99d2628752bfa1c939efc7 not found: ID does not exist" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.868603 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:23:04 crc kubenswrapper[4848]: E0217 09:23:04.869125 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" containerName="glance-httpd" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.869150 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" containerName="glance-httpd" Feb 17 09:23:04 crc kubenswrapper[4848]: E0217 09:23:04.869177 4848 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" containerName="glance-log" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.869185 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" containerName="glance-log" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.869396 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" containerName="glance-log" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.869421 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" containerName="glance-httpd" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.870541 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.872392 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.873113 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 17 09:23:04 crc kubenswrapper[4848]: I0217 09:23:04.884920 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.072693 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.072777 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/5f89c214-b934-465f-86ee-dec5f742237e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.072862 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.072911 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.072953 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f89c214-b934-465f-86ee-dec5f742237e-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.072976 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.073043 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glcch\" (UniqueName: 
\"kubernetes.io/projected/5f89c214-b934-465f-86ee-dec5f742237e-kube-api-access-glcch\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.073071 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: W0217 09:23:05.158837 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebc27fb3_ae13_4cc0_814d_ce4b3ecf8379.slice/crio-5bd0fbf2e59f3c335397aefe3cdc0bed7db21023b2f4b5acaa6bdbe28f5ce6be WatchSource:0}: Error finding container 5bd0fbf2e59f3c335397aefe3cdc0bed7db21023b2f4b5acaa6bdbe28f5ce6be: Status 404 returned error can't find the container with id 5bd0fbf2e59f3c335397aefe3cdc0bed7db21023b2f4b5acaa6bdbe28f5ce6be Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.160496 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.174948 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f89c214-b934-465f-86ee-dec5f742237e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.175004 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" 
(UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.175031 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.175065 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f89c214-b934-465f-86ee-dec5f742237e-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.175082 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.175132 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glcch\" (UniqueName: \"kubernetes.io/projected/5f89c214-b934-465f-86ee-dec5f742237e-kube-api-access-glcch\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.175152 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " 
pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.175193 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.175501 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.175708 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5f89c214-b934-465f-86ee-dec5f742237e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.176238 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5f89c214-b934-465f-86ee-dec5f742237e-logs\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.181969 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.182378 
4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.186616 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.192479 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f89c214-b934-465f-86ee-dec5f742237e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.202514 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glcch\" (UniqueName: \"kubernetes.io/projected/5f89c214-b934-465f-86ee-dec5f742237e-kube-api-access-glcch\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.227416 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-0\" (UID: \"5f89c214-b934-465f-86ee-dec5f742237e\") " pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.287342 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/nova-cell0-conductor-db-sync-5js8t"] Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.398334 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4" path="/var/lib/kubelet/pods/2d6a70de-d4e3-43a1-b4c8-949a4dcc0fb4/volumes" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.399191 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81fa19e-dc1a-4f09-8f05-b28099ae5f03" path="/var/lib/kubelet/pods/e81fa19e-dc1a-4f09-8f05-b28099ae5f03/volumes" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.487984 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.807007 4848 generic.go:334] "Generic (PLEG): container finished" podID="906e1770-f9be-4539-8bb9-443d30a30691" containerID="935e5d1283ea3c059f84d265e9004100f52bb8f7129d2c50479ea50f7b3dbcef" exitCode=0 Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.807097 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906e1770-f9be-4539-8bb9-443d30a30691","Type":"ContainerDied","Data":"935e5d1283ea3c059f84d265e9004100f52bb8f7129d2c50479ea50f7b3dbcef"} Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.811055 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379","Type":"ContainerStarted","Data":"5bd0fbf2e59f3c335397aefe3cdc0bed7db21023b2f4b5acaa6bdbe28f5ce6be"} Feb 17 09:23:05 crc kubenswrapper[4848]: I0217 09:23:05.811930 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5js8t" event={"ID":"910f96d1-b14a-49ea-9153-1fb90774711d","Type":"ContainerStarted","Data":"d3934c403af1fd43f42fb7f1d946719959e4a62fefb0ee34da03fca7662b51db"} Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.083421 4848 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 17 09:23:06 crc kubenswrapper[4848]: W0217 09:23:06.083989 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f89c214_b934_465f_86ee_dec5f742237e.slice/crio-b61f52aad9e01aefd3a9133b65b45f4692d8f0ac436d89b8c228bacb453261fb WatchSource:0}: Error finding container b61f52aad9e01aefd3a9133b65b45f4692d8f0ac436d89b8c228bacb453261fb: Status 404 returned error can't find the container with id b61f52aad9e01aefd3a9133b65b45f4692d8f0ac436d89b8c228bacb453261fb Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.107414 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.201593 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-config-data\") pod \"906e1770-f9be-4539-8bb9-443d30a30691\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.201650 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-sg-core-conf-yaml\") pod \"906e1770-f9be-4539-8bb9-443d30a30691\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.201704 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-run-httpd\") pod \"906e1770-f9be-4539-8bb9-443d30a30691\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.201772 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-scripts\") pod \"906e1770-f9be-4539-8bb9-443d30a30691\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.201795 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-log-httpd\") pod \"906e1770-f9be-4539-8bb9-443d30a30691\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.201848 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-combined-ca-bundle\") pod \"906e1770-f9be-4539-8bb9-443d30a30691\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.201869 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4t5pk\" (UniqueName: \"kubernetes.io/projected/906e1770-f9be-4539-8bb9-443d30a30691-kube-api-access-4t5pk\") pod \"906e1770-f9be-4539-8bb9-443d30a30691\" (UID: \"906e1770-f9be-4539-8bb9-443d30a30691\") " Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.202396 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "906e1770-f9be-4539-8bb9-443d30a30691" (UID: "906e1770-f9be-4539-8bb9-443d30a30691"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.203488 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "906e1770-f9be-4539-8bb9-443d30a30691" (UID: "906e1770-f9be-4539-8bb9-443d30a30691"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.207429 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906e1770-f9be-4539-8bb9-443d30a30691-kube-api-access-4t5pk" (OuterVolumeSpecName: "kube-api-access-4t5pk") pod "906e1770-f9be-4539-8bb9-443d30a30691" (UID: "906e1770-f9be-4539-8bb9-443d30a30691"). InnerVolumeSpecName "kube-api-access-4t5pk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.227661 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "906e1770-f9be-4539-8bb9-443d30a30691" (UID: "906e1770-f9be-4539-8bb9-443d30a30691"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.228879 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-scripts" (OuterVolumeSpecName: "scripts") pod "906e1770-f9be-4539-8bb9-443d30a30691" (UID: "906e1770-f9be-4539-8bb9-443d30a30691"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.283222 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "906e1770-f9be-4539-8bb9-443d30a30691" (UID: "906e1770-f9be-4539-8bb9-443d30a30691"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.301903 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-config-data" (OuterVolumeSpecName: "config-data") pod "906e1770-f9be-4539-8bb9-443d30a30691" (UID: "906e1770-f9be-4539-8bb9-443d30a30691"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.303512 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.303538 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.303550 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.303558 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/906e1770-f9be-4539-8bb9-443d30a30691-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:06 crc 
kubenswrapper[4848]: I0217 09:23:06.303567 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.303575 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4t5pk\" (UniqueName: \"kubernetes.io/projected/906e1770-f9be-4539-8bb9-443d30a30691-kube-api-access-4t5pk\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.303585 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/906e1770-f9be-4539-8bb9-443d30a30691-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.834261 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f89c214-b934-465f-86ee-dec5f742237e","Type":"ContainerStarted","Data":"2e48a8b66b6161c5643f1d3d3118cf547cfb4b8617bb7a7ac58fe3d0190f7d4a"} Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.834737 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f89c214-b934-465f-86ee-dec5f742237e","Type":"ContainerStarted","Data":"b61f52aad9e01aefd3a9133b65b45f4692d8f0ac436d89b8c228bacb453261fb"} Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.842536 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"906e1770-f9be-4539-8bb9-443d30a30691","Type":"ContainerDied","Data":"2ab9f4454bc844aa29163f9b6ce9c795b53fae8a61fbe5478160c18ce415d141"} Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.842578 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.842632 4848 scope.go:117] "RemoveContainer" containerID="45430d78172ca5d50a6dc833341639743bbc6f8c79cee072e40ca09cff63e39b" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.846557 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379","Type":"ContainerStarted","Data":"62a37929b4a2f508c9754d42e2c6ec8b6cef4abb5de1f590af1d617ff760a7d7"} Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.846606 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379","Type":"ContainerStarted","Data":"893ea5e103113cbbb8b03c4320fc10da9eab2893d98b1289346c6afd52ed8d29"} Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.870030 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.8700085509999997 podStartE2EDuration="3.870008551s" podCreationTimestamp="2026-02-17 09:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:06.866416834 +0000 UTC m=+1064.409672480" watchObservedRunningTime="2026-02-17 09:23:06.870008551 +0000 UTC m=+1064.413264197" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.909969 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.911474 4848 scope.go:117] "RemoveContainer" containerID="f79528e18c5cc8f692fa67e79a1867e949b33973f383b19bde58125138a2d245" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.931124 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.954018 4848 scope.go:117] 
"RemoveContainer" containerID="98a1a68593f8f7629947e05ce2026f56af30a7867f1f35f8f0c4b0c3faf2092c" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.957313 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:06 crc kubenswrapper[4848]: E0217 09:23:06.958153 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="sg-core" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.958180 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="sg-core" Feb 17 09:23:06 crc kubenswrapper[4848]: E0217 09:23:06.958195 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="ceilometer-central-agent" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.958205 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="ceilometer-central-agent" Feb 17 09:23:06 crc kubenswrapper[4848]: E0217 09:23:06.958224 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="ceilometer-notification-agent" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.958232 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="ceilometer-notification-agent" Feb 17 09:23:06 crc kubenswrapper[4848]: E0217 09:23:06.958253 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="proxy-httpd" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.958261 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="proxy-httpd" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.958454 4848 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="proxy-httpd" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.958483 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="ceilometer-central-agent" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.958511 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="ceilometer-notification-agent" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.958527 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="906e1770-f9be-4539-8bb9-443d30a30691" containerName="sg-core" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.960435 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.962950 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.969008 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.972270 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:06 crc kubenswrapper[4848]: I0217 09:23:06.988327 4848 scope.go:117] "RemoveContainer" containerID="935e5d1283ea3c059f84d265e9004100f52bb8f7129d2c50479ea50f7b3dbcef" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.117677 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-log-httpd\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.117740 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-scripts\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.117834 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x26p\" (UniqueName: \"kubernetes.io/projected/38727c59-b0b5-4d12-b88e-527cd503359e-kube-api-access-5x26p\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.117867 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.117883 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.117899 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-config-data\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.117927 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-run-httpd\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.168303 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:07 crc kubenswrapper[4848]: E0217 09:23:07.168910 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-5x26p log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="38727c59-b0b5-4d12-b88e-527cd503359e" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.219854 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.219891 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.219910 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-config-data\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.219942 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-run-httpd\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.220024 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-log-httpd\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.220059 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-scripts\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.220088 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x26p\" (UniqueName: \"kubernetes.io/projected/38727c59-b0b5-4d12-b88e-527cd503359e-kube-api-access-5x26p\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.220818 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-run-httpd\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.221057 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-log-httpd\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.224031 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.224593 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-config-data\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.224785 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-scripts\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.237449 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x26p\" (UniqueName: \"kubernetes.io/projected/38727c59-b0b5-4d12-b88e-527cd503359e-kube-api-access-5x26p\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.242268 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.394519 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="906e1770-f9be-4539-8bb9-443d30a30691" path="/var/lib/kubelet/pods/906e1770-f9be-4539-8bb9-443d30a30691/volumes" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.866744 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5f89c214-b934-465f-86ee-dec5f742237e","Type":"ContainerStarted","Data":"cba1907ca4ad78b11fa30e0b29b7e61b59709ebe8f2025e904cdbfe7542bc5c4"} Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.866846 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:07 crc kubenswrapper[4848]: I0217 09:23:07.884660 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.035158 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-log-httpd\") pod \"38727c59-b0b5-4d12-b88e-527cd503359e\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.035267 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x26p\" (UniqueName: \"kubernetes.io/projected/38727c59-b0b5-4d12-b88e-527cd503359e-kube-api-access-5x26p\") pod \"38727c59-b0b5-4d12-b88e-527cd503359e\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.035287 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-config-data\") pod \"38727c59-b0b5-4d12-b88e-527cd503359e\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.035354 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-scripts\") pod \"38727c59-b0b5-4d12-b88e-527cd503359e\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " Feb 17 09:23:08 crc 
kubenswrapper[4848]: I0217 09:23:08.035375 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-combined-ca-bundle\") pod \"38727c59-b0b5-4d12-b88e-527cd503359e\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.035454 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-sg-core-conf-yaml\") pod \"38727c59-b0b5-4d12-b88e-527cd503359e\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.035513 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-run-httpd\") pod \"38727c59-b0b5-4d12-b88e-527cd503359e\" (UID: \"38727c59-b0b5-4d12-b88e-527cd503359e\") " Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.036997 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "38727c59-b0b5-4d12-b88e-527cd503359e" (UID: "38727c59-b0b5-4d12-b88e-527cd503359e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.042030 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-config-data" (OuterVolumeSpecName: "config-data") pod "38727c59-b0b5-4d12-b88e-527cd503359e" (UID: "38727c59-b0b5-4d12-b88e-527cd503359e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.043617 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38727c59-b0b5-4d12-b88e-527cd503359e-kube-api-access-5x26p" (OuterVolumeSpecName: "kube-api-access-5x26p") pod "38727c59-b0b5-4d12-b88e-527cd503359e" (UID: "38727c59-b0b5-4d12-b88e-527cd503359e"). InnerVolumeSpecName "kube-api-access-5x26p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.043859 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "38727c59-b0b5-4d12-b88e-527cd503359e" (UID: "38727c59-b0b5-4d12-b88e-527cd503359e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.045306 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "38727c59-b0b5-4d12-b88e-527cd503359e" (UID: "38727c59-b0b5-4d12-b88e-527cd503359e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.046883 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-scripts" (OuterVolumeSpecName: "scripts") pod "38727c59-b0b5-4d12-b88e-527cd503359e" (UID: "38727c59-b0b5-4d12-b88e-527cd503359e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.046977 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "38727c59-b0b5-4d12-b88e-527cd503359e" (UID: "38727c59-b0b5-4d12-b88e-527cd503359e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.137835 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.137872 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x26p\" (UniqueName: \"kubernetes.io/projected/38727c59-b0b5-4d12-b88e-527cd503359e-kube-api-access-5x26p\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.137886 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.137896 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.137904 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.137913 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/38727c59-b0b5-4d12-b88e-527cd503359e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.137921 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/38727c59-b0b5-4d12-b88e-527cd503359e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.875035 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.908541 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.908522401 podStartE2EDuration="4.908522401s" podCreationTimestamp="2026-02-17 09:23:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:07.893443915 +0000 UTC m=+1065.436699561" watchObservedRunningTime="2026-02-17 09:23:08.908522401 +0000 UTC m=+1066.451778067" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.924849 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.931377 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.954675 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.956553 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:08 crc kubenswrapper[4848]: I0217 09:23:08.962232 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.016529 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.016531 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.050891 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-run-httpd\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.050941 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.050977 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-scripts\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.051003 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-log-httpd\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " 
pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.051060 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-config-data\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.051083 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvh52\" (UniqueName: \"kubernetes.io/projected/caa30b8c-e57d-4b80-acb0-232c056de94e-kube-api-access-gvh52\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.051147 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.152428 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-run-httpd\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.152485 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.152510 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-scripts\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.152535 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-log-httpd\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.152575 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-config-data\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.152598 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvh52\" (UniqueName: \"kubernetes.io/projected/caa30b8c-e57d-4b80-acb0-232c056de94e-kube-api-access-gvh52\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.152657 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.157004 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" 
Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.157310 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-log-httpd\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.157490 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-run-httpd\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.161137 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-scripts\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.161303 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-config-data\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.170691 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.174664 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvh52\" (UniqueName: \"kubernetes.io/projected/caa30b8c-e57d-4b80-acb0-232c056de94e-kube-api-access-gvh52\") pod \"ceilometer-0\" (UID: 
\"caa30b8c-e57d-4b80-acb0-232c056de94e\") " pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.328698 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:09 crc kubenswrapper[4848]: I0217 09:23:09.392573 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38727c59-b0b5-4d12-b88e-527cd503359e" path="/var/lib/kubelet/pods/38727c59-b0b5-4d12-b88e-527cd503359e/volumes" Feb 17 09:23:13 crc kubenswrapper[4848]: I0217 09:23:13.934484 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5js8t" event={"ID":"910f96d1-b14a-49ea-9153-1fb90774711d","Type":"ContainerStarted","Data":"9cd17a4bb61419d7e3274e5c1432e3ebe90d529a0e6040483e8cb4d1168587ee"} Feb 17 09:23:13 crc kubenswrapper[4848]: I0217 09:23:13.974788 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:13 crc kubenswrapper[4848]: I0217 09:23:13.981798 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-5js8t" podStartSLOduration=1.7292309590000001 podStartE2EDuration="9.981752271s" podCreationTimestamp="2026-02-17 09:23:04 +0000 UTC" firstStartedPulling="2026-02-17 09:23:05.293061842 +0000 UTC m=+1062.836317488" lastFinishedPulling="2026-02-17 09:23:13.545583154 +0000 UTC m=+1071.088838800" observedRunningTime="2026-02-17 09:23:13.964051773 +0000 UTC m=+1071.507307419" watchObservedRunningTime="2026-02-17 09:23:13.981752271 +0000 UTC m=+1071.525007937" Feb 17 09:23:14 crc kubenswrapper[4848]: I0217 09:23:14.295857 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 09:23:14 crc kubenswrapper[4848]: I0217 09:23:14.295920 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Feb 17 09:23:14 crc 
kubenswrapper[4848]: I0217 09:23:14.340181 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 09:23:14 crc kubenswrapper[4848]: I0217 09:23:14.349306 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Feb 17 09:23:14 crc kubenswrapper[4848]: I0217 09:23:14.946693 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caa30b8c-e57d-4b80-acb0-232c056de94e","Type":"ContainerStarted","Data":"e3521e1b628b3af1778beb4199c92eb28e540b6197fa01a76a9812c8c23f4e6a"} Feb 17 09:23:14 crc kubenswrapper[4848]: I0217 09:23:14.947011 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caa30b8c-e57d-4b80-acb0-232c056de94e","Type":"ContainerStarted","Data":"cd71920c861dac317c9385bb830dc809ccd71bef1ea7052710ade2e5243a364b"} Feb 17 09:23:14 crc kubenswrapper[4848]: I0217 09:23:14.947901 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 09:23:14 crc kubenswrapper[4848]: I0217 09:23:14.947930 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Feb 17 09:23:16 crc kubenswrapper[4848]: I0217 09:23:15.550380 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:16 crc kubenswrapper[4848]: I0217 09:23:15.550409 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:16 crc kubenswrapper[4848]: I0217 09:23:15.590797 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:16 crc kubenswrapper[4848]: I0217 09:23:15.600925 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-internal-api-0" Feb 17 09:23:16 crc kubenswrapper[4848]: I0217 09:23:15.957651 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:16 crc kubenswrapper[4848]: I0217 09:23:15.957687 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:16 crc kubenswrapper[4848]: I0217 09:23:16.542906 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-5db88f68c-7swkp" podUID="e72f9717-510f-4f9e-8557-ccd69b4dc61c" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.86:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 09:23:16 crc kubenswrapper[4848]: I0217 09:23:16.592116 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-6df4786bd-895gn" podUID="054a38ba-b80d-44df-b84a-e5e3b9847df3" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.45:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 17 09:23:16 crc kubenswrapper[4848]: I0217 09:23:16.964844 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:23:16 crc kubenswrapper[4848]: I0217 09:23:16.964878 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:23:17 crc kubenswrapper[4848]: I0217 09:23:17.973289 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:17 crc kubenswrapper[4848]: I0217 09:23:17.974193 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:23:17 crc kubenswrapper[4848]: I0217 09:23:17.975052 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"caa30b8c-e57d-4b80-acb0-232c056de94e","Type":"ContainerStarted","Data":"d367362ba96077b36b4bd9dec4a67ff2fb87c5a88e59d5b3f7a6ea89d71bf759"} Feb 17 09:23:17 crc kubenswrapper[4848]: I0217 09:23:17.975079 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caa30b8c-e57d-4b80-acb0-232c056de94e","Type":"ContainerStarted","Data":"bb02958f3d5bcf25e322ca788b8297d5c5a42f78b848c9422900191b84e1d718"} Feb 17 09:23:17 crc kubenswrapper[4848]: I0217 09:23:17.975553 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 17 09:23:18 crc kubenswrapper[4848]: I0217 09:23:18.181740 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 09:23:18 crc kubenswrapper[4848]: I0217 09:23:18.182146 4848 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 17 09:23:18 crc kubenswrapper[4848]: I0217 09:23:18.183736 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 17 09:23:19 crc kubenswrapper[4848]: I0217 09:23:19.990293 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caa30b8c-e57d-4b80-acb0-232c056de94e","Type":"ContainerStarted","Data":"1b3579f43dd32f27b8da514ac47c9e21222e4743b7db8b531a1df6d84e80c7ce"} Feb 17 09:23:19 crc kubenswrapper[4848]: I0217 09:23:19.990917 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 09:23:20 crc kubenswrapper[4848]: I0217 09:23:20.017024 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.132246984 podStartE2EDuration="12.017003785s" podCreationTimestamp="2026-02-17 09:23:08 +0000 UTC" firstStartedPulling="2026-02-17 09:23:13.964864695 +0000 UTC m=+1071.508120351" lastFinishedPulling="2026-02-17 
09:23:18.849621506 +0000 UTC m=+1076.392877152" observedRunningTime="2026-02-17 09:23:20.008467275 +0000 UTC m=+1077.551722941" watchObservedRunningTime="2026-02-17 09:23:20.017003785 +0000 UTC m=+1077.560259441" Feb 17 09:23:21 crc kubenswrapper[4848]: I0217 09:23:21.462230 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:22 crc kubenswrapper[4848]: I0217 09:23:22.008236 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="ceilometer-central-agent" containerID="cri-o://e3521e1b628b3af1778beb4199c92eb28e540b6197fa01a76a9812c8c23f4e6a" gracePeriod=30 Feb 17 09:23:22 crc kubenswrapper[4848]: I0217 09:23:22.008418 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="proxy-httpd" containerID="cri-o://1b3579f43dd32f27b8da514ac47c9e21222e4743b7db8b531a1df6d84e80c7ce" gracePeriod=30 Feb 17 09:23:22 crc kubenswrapper[4848]: I0217 09:23:22.008472 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="sg-core" containerID="cri-o://d367362ba96077b36b4bd9dec4a67ff2fb87c5a88e59d5b3f7a6ea89d71bf759" gracePeriod=30 Feb 17 09:23:22 crc kubenswrapper[4848]: I0217 09:23:22.008514 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="ceilometer-notification-agent" containerID="cri-o://bb02958f3d5bcf25e322ca788b8297d5c5a42f78b848c9422900191b84e1d718" gracePeriod=30 Feb 17 09:23:23 crc kubenswrapper[4848]: I0217 09:23:23.023517 4848 generic.go:334] "Generic (PLEG): container finished" podID="caa30b8c-e57d-4b80-acb0-232c056de94e" 
containerID="1b3579f43dd32f27b8da514ac47c9e21222e4743b7db8b531a1df6d84e80c7ce" exitCode=0 Feb 17 09:23:23 crc kubenswrapper[4848]: I0217 09:23:23.024019 4848 generic.go:334] "Generic (PLEG): container finished" podID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerID="d367362ba96077b36b4bd9dec4a67ff2fb87c5a88e59d5b3f7a6ea89d71bf759" exitCode=2 Feb 17 09:23:23 crc kubenswrapper[4848]: I0217 09:23:23.024038 4848 generic.go:334] "Generic (PLEG): container finished" podID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerID="bb02958f3d5bcf25e322ca788b8297d5c5a42f78b848c9422900191b84e1d718" exitCode=0 Feb 17 09:23:23 crc kubenswrapper[4848]: I0217 09:23:23.023697 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caa30b8c-e57d-4b80-acb0-232c056de94e","Type":"ContainerDied","Data":"1b3579f43dd32f27b8da514ac47c9e21222e4743b7db8b531a1df6d84e80c7ce"} Feb 17 09:23:23 crc kubenswrapper[4848]: I0217 09:23:23.024091 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caa30b8c-e57d-4b80-acb0-232c056de94e","Type":"ContainerDied","Data":"d367362ba96077b36b4bd9dec4a67ff2fb87c5a88e59d5b3f7a6ea89d71bf759"} Feb 17 09:23:23 crc kubenswrapper[4848]: I0217 09:23:23.024115 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caa30b8c-e57d-4b80-acb0-232c056de94e","Type":"ContainerDied","Data":"bb02958f3d5bcf25e322ca788b8297d5c5a42f78b848c9422900191b84e1d718"} Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.060262 4848 generic.go:334] "Generic (PLEG): container finished" podID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerID="e3521e1b628b3af1778beb4199c92eb28e540b6197fa01a76a9812c8c23f4e6a" exitCode=0 Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.060761 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"caa30b8c-e57d-4b80-acb0-232c056de94e","Type":"ContainerDied","Data":"e3521e1b628b3af1778beb4199c92eb28e540b6197fa01a76a9812c8c23f4e6a"} Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.271439 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.436752 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-config-data\") pod \"caa30b8c-e57d-4b80-acb0-232c056de94e\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.436849 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-sg-core-conf-yaml\") pod \"caa30b8c-e57d-4b80-acb0-232c056de94e\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.436892 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-run-httpd\") pod \"caa30b8c-e57d-4b80-acb0-232c056de94e\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.436996 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvh52\" (UniqueName: \"kubernetes.io/projected/caa30b8c-e57d-4b80-acb0-232c056de94e-kube-api-access-gvh52\") pod \"caa30b8c-e57d-4b80-acb0-232c056de94e\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.437478 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod 
"caa30b8c-e57d-4b80-acb0-232c056de94e" (UID: "caa30b8c-e57d-4b80-acb0-232c056de94e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.437517 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-combined-ca-bundle\") pod \"caa30b8c-e57d-4b80-acb0-232c056de94e\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.437585 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-log-httpd\") pod \"caa30b8c-e57d-4b80-acb0-232c056de94e\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.437618 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-scripts\") pod \"caa30b8c-e57d-4b80-acb0-232c056de94e\" (UID: \"caa30b8c-e57d-4b80-acb0-232c056de94e\") " Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.437890 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "caa30b8c-e57d-4b80-acb0-232c056de94e" (UID: "caa30b8c-e57d-4b80-acb0-232c056de94e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.438703 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.438722 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/caa30b8c-e57d-4b80-acb0-232c056de94e-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.442499 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-scripts" (OuterVolumeSpecName: "scripts") pod "caa30b8c-e57d-4b80-acb0-232c056de94e" (UID: "caa30b8c-e57d-4b80-acb0-232c056de94e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.442879 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa30b8c-e57d-4b80-acb0-232c056de94e-kube-api-access-gvh52" (OuterVolumeSpecName: "kube-api-access-gvh52") pod "caa30b8c-e57d-4b80-acb0-232c056de94e" (UID: "caa30b8c-e57d-4b80-acb0-232c056de94e"). InnerVolumeSpecName "kube-api-access-gvh52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.481064 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "caa30b8c-e57d-4b80-acb0-232c056de94e" (UID: "caa30b8c-e57d-4b80-acb0-232c056de94e"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.520327 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "caa30b8c-e57d-4b80-acb0-232c056de94e" (UID: "caa30b8c-e57d-4b80-acb0-232c056de94e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.542141 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.542288 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvh52\" (UniqueName: \"kubernetes.io/projected/caa30b8c-e57d-4b80-acb0-232c056de94e-kube-api-access-gvh52\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.542352 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.542423 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.555645 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-config-data" (OuterVolumeSpecName: "config-data") pod "caa30b8c-e57d-4b80-acb0-232c056de94e" (UID: "caa30b8c-e57d-4b80-acb0-232c056de94e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:25 crc kubenswrapper[4848]: I0217 09:23:25.644432 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/caa30b8c-e57d-4b80-acb0-232c056de94e-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.079253 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"caa30b8c-e57d-4b80-acb0-232c056de94e","Type":"ContainerDied","Data":"cd71920c861dac317c9385bb830dc809ccd71bef1ea7052710ade2e5243a364b"} Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.079341 4848 scope.go:117] "RemoveContainer" containerID="1b3579f43dd32f27b8da514ac47c9e21222e4743b7db8b531a1df6d84e80c7ce" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.079449 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.082148 4848 generic.go:334] "Generic (PLEG): container finished" podID="910f96d1-b14a-49ea-9153-1fb90774711d" containerID="9cd17a4bb61419d7e3274e5c1432e3ebe90d529a0e6040483e8cb4d1168587ee" exitCode=0 Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.082405 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5js8t" event={"ID":"910f96d1-b14a-49ea-9153-1fb90774711d","Type":"ContainerDied","Data":"9cd17a4bb61419d7e3274e5c1432e3ebe90d529a0e6040483e8cb4d1168587ee"} Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.139824 4848 scope.go:117] "RemoveContainer" containerID="d367362ba96077b36b4bd9dec4a67ff2fb87c5a88e59d5b3f7a6ea89d71bf759" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.146475 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.169517 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceilometer-0"] Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.183578 4848 scope.go:117] "RemoveContainer" containerID="bb02958f3d5bcf25e322ca788b8297d5c5a42f78b848c9422900191b84e1d718" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.191624 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:26 crc kubenswrapper[4848]: E0217 09:23:26.192139 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="sg-core" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.192157 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="sg-core" Feb 17 09:23:26 crc kubenswrapper[4848]: E0217 09:23:26.192189 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="proxy-httpd" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.192197 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="proxy-httpd" Feb 17 09:23:26 crc kubenswrapper[4848]: E0217 09:23:26.192220 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="ceilometer-central-agent" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.192230 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="ceilometer-central-agent" Feb 17 09:23:26 crc kubenswrapper[4848]: E0217 09:23:26.192251 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="ceilometer-notification-agent" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.192260 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="ceilometer-notification-agent" Feb 17 09:23:26 crc 
kubenswrapper[4848]: I0217 09:23:26.192458 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="sg-core" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.192478 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="proxy-httpd" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.192497 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="ceilometer-central-agent" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.192514 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" containerName="ceilometer-notification-agent" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.194515 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.196956 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.198140 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.204106 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.228551 4848 scope.go:117] "RemoveContainer" containerID="e3521e1b628b3af1778beb4199c92eb28e540b6197fa01a76a9812c8c23f4e6a" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.365964 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf84l\" (UniqueName: \"kubernetes.io/projected/b9561f05-0df5-433a-ba20-050c453688cf-kube-api-access-nf84l\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " 
pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.366162 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-run-httpd\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.366229 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.366271 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.366320 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-scripts\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.366417 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-config-data\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.366479 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-log-httpd\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.468337 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf84l\" (UniqueName: \"kubernetes.io/projected/b9561f05-0df5-433a-ba20-050c453688cf-kube-api-access-nf84l\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.468508 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-run-httpd\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.468597 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.468656 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.469431 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-run-httpd\") pod \"ceilometer-0\" (UID: 
\"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.469715 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-scripts\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.469891 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-config-data\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.469955 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-log-httpd\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.470531 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-log-httpd\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.474442 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.475152 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-config-data\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.475618 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.475807 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-scripts\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.485579 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf84l\" (UniqueName: \"kubernetes.io/projected/b9561f05-0df5-433a-ba20-050c453688cf-kube-api-access-nf84l\") pod \"ceilometer-0\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") " pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: I0217 09:23:26.530379 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:23:26 crc kubenswrapper[4848]: W0217 09:23:26.996505 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9561f05_0df5_433a_ba20_050c453688cf.slice/crio-c752e5d5211cbf7525f086dc9ae26057384173e77aa716a9bccc0c1594b2fef0 WatchSource:0}: Error finding container c752e5d5211cbf7525f086dc9ae26057384173e77aa716a9bccc0c1594b2fef0: Status 404 returned error can't find the container with id c752e5d5211cbf7525f086dc9ae26057384173e77aa716a9bccc0c1594b2fef0 Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.008411 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.095542 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9561f05-0df5-433a-ba20-050c453688cf","Type":"ContainerStarted","Data":"c752e5d5211cbf7525f086dc9ae26057384173e77aa716a9bccc0c1594b2fef0"} Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.394223 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caa30b8c-e57d-4b80-acb0-232c056de94e" path="/var/lib/kubelet/pods/caa30b8c-e57d-4b80-acb0-232c056de94e/volumes" Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.568374 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5js8t" Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.691233 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-scripts\") pod \"910f96d1-b14a-49ea-9153-1fb90774711d\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.691763 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-config-data\") pod \"910f96d1-b14a-49ea-9153-1fb90774711d\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.691857 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-combined-ca-bundle\") pod \"910f96d1-b14a-49ea-9153-1fb90774711d\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.692063 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4cr4g\" (UniqueName: \"kubernetes.io/projected/910f96d1-b14a-49ea-9153-1fb90774711d-kube-api-access-4cr4g\") pod \"910f96d1-b14a-49ea-9153-1fb90774711d\" (UID: \"910f96d1-b14a-49ea-9153-1fb90774711d\") " Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.699968 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/910f96d1-b14a-49ea-9153-1fb90774711d-kube-api-access-4cr4g" (OuterVolumeSpecName: "kube-api-access-4cr4g") pod "910f96d1-b14a-49ea-9153-1fb90774711d" (UID: "910f96d1-b14a-49ea-9153-1fb90774711d"). InnerVolumeSpecName "kube-api-access-4cr4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.700035 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-scripts" (OuterVolumeSpecName: "scripts") pod "910f96d1-b14a-49ea-9153-1fb90774711d" (UID: "910f96d1-b14a-49ea-9153-1fb90774711d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.738694 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-config-data" (OuterVolumeSpecName: "config-data") pod "910f96d1-b14a-49ea-9153-1fb90774711d" (UID: "910f96d1-b14a-49ea-9153-1fb90774711d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.742146 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "910f96d1-b14a-49ea-9153-1fb90774711d" (UID: "910f96d1-b14a-49ea-9153-1fb90774711d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.794233 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.794274 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.794286 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4cr4g\" (UniqueName: \"kubernetes.io/projected/910f96d1-b14a-49ea-9153-1fb90774711d-kube-api-access-4cr4g\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:27 crc kubenswrapper[4848]: I0217 09:23:27.794296 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/910f96d1-b14a-49ea-9153-1fb90774711d-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.135914 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9561f05-0df5-433a-ba20-050c453688cf","Type":"ContainerStarted","Data":"d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49"} Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.141045 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-5js8t" event={"ID":"910f96d1-b14a-49ea-9153-1fb90774711d","Type":"ContainerDied","Data":"d3934c403af1fd43f42fb7f1d946719959e4a62fefb0ee34da03fca7662b51db"} Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.141088 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3934c403af1fd43f42fb7f1d946719959e4a62fefb0ee34da03fca7662b51db" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 
09:23:28.141153 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-5js8t" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.210834 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 09:23:28 crc kubenswrapper[4848]: E0217 09:23:28.212682 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="910f96d1-b14a-49ea-9153-1fb90774711d" containerName="nova-cell0-conductor-db-sync" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.212732 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="910f96d1-b14a-49ea-9153-1fb90774711d" containerName="nova-cell0-conductor-db-sync" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.213647 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="910f96d1-b14a-49ea-9153-1fb90774711d" containerName="nova-cell0-conductor-db-sync" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.215412 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.232102 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.233101 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-vcxc2" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.255941 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.312145 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8dbe1d-cea3-4cf7-a8ef-210410453732-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4a8dbe1d-cea3-4cf7-a8ef-210410453732\") " pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.312583 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8dbe1d-cea3-4cf7-a8ef-210410453732-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4a8dbe1d-cea3-4cf7-a8ef-210410453732\") " pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.312604 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k55jj\" (UniqueName: \"kubernetes.io/projected/4a8dbe1d-cea3-4cf7-a8ef-210410453732-kube-api-access-k55jj\") pod \"nova-cell0-conductor-0\" (UID: \"4a8dbe1d-cea3-4cf7-a8ef-210410453732\") " pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.414031 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4a8dbe1d-cea3-4cf7-a8ef-210410453732-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4a8dbe1d-cea3-4cf7-a8ef-210410453732\") " pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.414161 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8dbe1d-cea3-4cf7-a8ef-210410453732-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4a8dbe1d-cea3-4cf7-a8ef-210410453732\") " pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.414185 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k55jj\" (UniqueName: \"kubernetes.io/projected/4a8dbe1d-cea3-4cf7-a8ef-210410453732-kube-api-access-k55jj\") pod \"nova-cell0-conductor-0\" (UID: \"4a8dbe1d-cea3-4cf7-a8ef-210410453732\") " pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.417935 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a8dbe1d-cea3-4cf7-a8ef-210410453732-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"4a8dbe1d-cea3-4cf7-a8ef-210410453732\") " pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.427297 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a8dbe1d-cea3-4cf7-a8ef-210410453732-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"4a8dbe1d-cea3-4cf7-a8ef-210410453732\") " pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.435253 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k55jj\" (UniqueName: \"kubernetes.io/projected/4a8dbe1d-cea3-4cf7-a8ef-210410453732-kube-api-access-k55jj\") pod \"nova-cell0-conductor-0\" 
(UID: \"4a8dbe1d-cea3-4cf7-a8ef-210410453732\") " pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:28 crc kubenswrapper[4848]: I0217 09:23:28.609123 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:29 crc kubenswrapper[4848]: I0217 09:23:29.089736 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 17 09:23:29 crc kubenswrapper[4848]: W0217 09:23:29.096856 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a8dbe1d_cea3_4cf7_a8ef_210410453732.slice/crio-d80d2e812461a068a765c293c3e770fe9c4bb09ca31fee3b8cf8c0a15c0ef0d8 WatchSource:0}: Error finding container d80d2e812461a068a765c293c3e770fe9c4bb09ca31fee3b8cf8c0a15c0ef0d8: Status 404 returned error can't find the container with id d80d2e812461a068a765c293c3e770fe9c4bb09ca31fee3b8cf8c0a15c0ef0d8 Feb 17 09:23:29 crc kubenswrapper[4848]: I0217 09:23:29.153162 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9561f05-0df5-433a-ba20-050c453688cf","Type":"ContainerStarted","Data":"44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0"} Feb 17 09:23:29 crc kubenswrapper[4848]: I0217 09:23:29.154549 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4a8dbe1d-cea3-4cf7-a8ef-210410453732","Type":"ContainerStarted","Data":"d80d2e812461a068a765c293c3e770fe9c4bb09ca31fee3b8cf8c0a15c0ef0d8"} Feb 17 09:23:30 crc kubenswrapper[4848]: I0217 09:23:30.167207 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"4a8dbe1d-cea3-4cf7-a8ef-210410453732","Type":"ContainerStarted","Data":"4621226df39c13ff7d34d887b23491981790e00e42d761b2a0d5b10fd8d6f390"} Feb 17 09:23:30 crc kubenswrapper[4848]: I0217 09:23:30.167793 4848 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:30 crc kubenswrapper[4848]: I0217 09:23:30.171570 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9561f05-0df5-433a-ba20-050c453688cf","Type":"ContainerStarted","Data":"43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4"} Feb 17 09:23:30 crc kubenswrapper[4848]: I0217 09:23:30.186282 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.186253272 podStartE2EDuration="2.186253272s" podCreationTimestamp="2026-02-17 09:23:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:30.182094689 +0000 UTC m=+1087.725350335" watchObservedRunningTime="2026-02-17 09:23:30.186253272 +0000 UTC m=+1087.729508918" Feb 17 09:23:31 crc kubenswrapper[4848]: I0217 09:23:31.184678 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9561f05-0df5-433a-ba20-050c453688cf","Type":"ContainerStarted","Data":"2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565"} Feb 17 09:23:31 crc kubenswrapper[4848]: I0217 09:23:31.243443 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.310238546 podStartE2EDuration="5.243424465s" podCreationTimestamp="2026-02-17 09:23:26 +0000 UTC" firstStartedPulling="2026-02-17 09:23:26.999567558 +0000 UTC m=+1084.542823224" lastFinishedPulling="2026-02-17 09:23:30.932753497 +0000 UTC m=+1088.476009143" observedRunningTime="2026-02-17 09:23:31.241691808 +0000 UTC m=+1088.784947474" watchObservedRunningTime="2026-02-17 09:23:31.243424465 +0000 UTC m=+1088.786680121" Feb 17 09:23:32 crc kubenswrapper[4848]: I0217 09:23:32.196068 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" 
Feb 17 09:23:38 crc kubenswrapper[4848]: I0217 09:23:38.650103 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.240724 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-h9vlc"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.242930 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.245007 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.258059 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.270588 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9vlc"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.364391 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlhcb\" (UniqueName: \"kubernetes.io/projected/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-kube-api-access-mlhcb\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.364791 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.364844 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-scripts\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.364947 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-config-data\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.448513 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.454445 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.456658 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.461559 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.466191 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-config-data\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.466290 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlhcb\" (UniqueName: \"kubernetes.io/projected/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-kube-api-access-mlhcb\") pod 
\"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.466314 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.466349 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-scripts\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.472273 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-config-data\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.474011 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.477141 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-scripts\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " 
pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.490344 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlhcb\" (UniqueName: \"kubernetes.io/projected/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-kube-api-access-mlhcb\") pod \"nova-cell0-cell-mapping-h9vlc\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.548387 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.552143 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.554085 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.567727 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22f73908-5ca1-44fb-ad87-1f1b709284bf-logs\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.567894 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.567950 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmdn9\" (UniqueName: \"kubernetes.io/projected/22f73908-5ca1-44fb-ad87-1f1b709284bf-kube-api-access-wmdn9\") pod \"nova-api-0\" (UID: 
\"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.567979 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-config-data\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.573819 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.575739 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.577443 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.577964 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.622840 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.624096 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.629606 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.653924 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.695026 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.708639 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22f73908-5ca1-44fb-ad87-1f1b709284bf-logs\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.708685 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.708741 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-config-data\") pod \"nova-metadata-0\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.708789 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kw5f\" (UniqueName: \"kubernetes.io/projected/06701a3a-e547-4022-87bc-9e40af29efa4-kube-api-access-5kw5f\") pod \"nova-metadata-0\" (UID: 
\"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.708839 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06701a3a-e547-4022-87bc-9e40af29efa4-logs\") pod \"nova-metadata-0\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.708899 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.708927 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8mpq\" (UniqueName: \"kubernetes.io/projected/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-kube-api-access-s8mpq\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.708994 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.709074 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmdn9\" (UniqueName: \"kubernetes.io/projected/22f73908-5ca1-44fb-ad87-1f1b709284bf-kube-api-access-wmdn9\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc 
kubenswrapper[4848]: I0217 09:23:39.709102 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-config-data\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.709146 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.718494 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22f73908-5ca1-44fb-ad87-1f1b709284bf-logs\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.728295 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.736276 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.736914 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-config-data\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.761179 4848 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-wmdn9\" (UniqueName: \"kubernetes.io/projected/22f73908-5ca1-44fb-ad87-1f1b709284bf-kube-api-access-wmdn9\") pod \"nova-api-0\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") " pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.794801 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-b64x5"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.796854 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.811211 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-config-data\") pod \"nova-scheduler-0\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.811325 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.811395 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.811449 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-config-data\") pod \"nova-metadata-0\" (UID: 
\"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.811493 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kw5f\" (UniqueName: \"kubernetes.io/projected/06701a3a-e547-4022-87bc-9e40af29efa4-kube-api-access-5kw5f\") pod \"nova-metadata-0\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.811570 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06701a3a-e547-4022-87bc-9e40af29efa4-logs\") pod \"nova-metadata-0\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.811603 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8km9p\" (UniqueName: \"kubernetes.io/projected/0e104ad7-09b3-4881-857c-0384ea158171-kube-api-access-8km9p\") pod \"nova-scheduler-0\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.812566 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.812603 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8mpq\" (UniqueName: \"kubernetes.io/projected/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-kube-api-access-s8mpq\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: 
I0217 09:23:39.812649 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.815312 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06701a3a-e547-4022-87bc-9e40af29efa4-logs\") pod \"nova-metadata-0\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.818724 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-config-data\") pod \"nova-metadata-0\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.823122 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.823244 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.824594 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.832407 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-b64x5"] Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.832928 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kw5f\" (UniqueName: \"kubernetes.io/projected/06701a3a-e547-4022-87bc-9e40af29efa4-kube-api-access-5kw5f\") pod \"nova-metadata-0\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.844437 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8mpq\" (UniqueName: \"kubernetes.io/projected/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-kube-api-access-s8mpq\") pod \"nova-cell1-novncproxy-0\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") " pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.858628 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.870856 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.900853 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.914023 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-svc\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.914109 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.914220 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-swift-storage-0\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.914260 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.914290 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8km9p\" (UniqueName: \"kubernetes.io/projected/0e104ad7-09b3-4881-857c-0384ea158171-kube-api-access-8km9p\") pod \"nova-scheduler-0\" (UID: 
\"0e104ad7-09b3-4881-857c-0384ea158171\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.914353 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.914407 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-config\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.914425 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-config-data\") pod \"nova-scheduler-0\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.914443 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnq2p\" (UniqueName: \"kubernetes.io/projected/fe686229-24bb-4b7c-aab8-64f78e05d1f1-kube-api-access-jnq2p\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.923640 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-config-data\") pod \"nova-scheduler-0\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:39 crc 
kubenswrapper[4848]: I0217 09:23:39.924964 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:39 crc kubenswrapper[4848]: I0217 09:23:39.932358 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8km9p\" (UniqueName: \"kubernetes.io/projected/0e104ad7-09b3-4881-857c-0384ea158171-kube-api-access-8km9p\") pod \"nova-scheduler-0\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.016022 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-swift-storage-0\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.016058 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.016117 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-config\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.016135 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-jnq2p\" (UniqueName: \"kubernetes.io/projected/fe686229-24bb-4b7c-aab8-64f78e05d1f1-kube-api-access-jnq2p\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.016165 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-svc\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.016193 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.017272 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-sb\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.017838 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-swift-storage-0\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.018320 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-nb\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.019210 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-config\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.019600 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.020400 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-svc\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.056914 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnq2p\" (UniqueName: \"kubernetes.io/projected/fe686229-24bb-4b7c-aab8-64f78e05d1f1-kube-api-access-jnq2p\") pod \"dnsmasq-dns-6bc699f5c5-b64x5\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.117175 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.197152 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9vlc"] Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.287070 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9vlc" event={"ID":"7a4198c9-c56f-45e9-90f7-486ddcb9d65f","Type":"ContainerStarted","Data":"d2933fe557f6f89db33290275d95d52d92327b3979a8a03101ceb46a06ce3894"} Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.310855 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zgwn9"] Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.312295 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.316262 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.316550 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.321151 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zgwn9"] Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.425442 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwpbx\" (UniqueName: \"kubernetes.io/projected/5f95863d-9560-4508-9115-5da47d8dd4c2-kube-api-access-gwpbx\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.425965 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-config-data\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.426049 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.426193 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-scripts\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: W0217 09:23:40.480115 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22f73908_5ca1_44fb_ad87_1f1b709284bf.slice/crio-7f6729c76284594877686b5d086b5a4ffb30ad4216f401cc301e3996a7a006e3 WatchSource:0}: Error finding container 7f6729c76284594877686b5d086b5a4ffb30ad4216f401cc301e3996a7a006e3: Status 404 returned error can't find the container with id 7f6729c76284594877686b5d086b5a4ffb30ad4216f401cc301e3996a7a006e3 Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.480715 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.513669 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:40 
crc kubenswrapper[4848]: I0217 09:23:40.527350 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwpbx\" (UniqueName: \"kubernetes.io/projected/5f95863d-9560-4508-9115-5da47d8dd4c2-kube-api-access-gwpbx\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.527517 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-config-data\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.527600 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.527710 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-scripts\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.534859 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc 
kubenswrapper[4848]: I0217 09:23:40.536305 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-scripts\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.536348 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-config-data\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.553814 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwpbx\" (UniqueName: \"kubernetes.io/projected/5f95863d-9560-4508-9115-5da47d8dd4c2-kube-api-access-gwpbx\") pod \"nova-cell1-conductor-db-sync-zgwn9\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.602454 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 09:23:40 crc kubenswrapper[4848]: W0217 09:23:40.608278 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a0a24b3_51ac_4726_ac00_f6d52f2b1bad.slice/crio-d6a047b039309da3b1aa628f04b650c5eebc65a4baf6288e5982ee90da896f26 WatchSource:0}: Error finding container d6a047b039309da3b1aa628f04b650c5eebc65a4baf6288e5982ee90da896f26: Status 404 returned error can't find the container with id d6a047b039309da3b1aa628f04b650c5eebc65a4baf6288e5982ee90da896f26 Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.635890 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.735403 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:23:40 crc kubenswrapper[4848]: I0217 09:23:40.743968 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-b64x5"] Feb 17 09:23:40 crc kubenswrapper[4848]: W0217 09:23:40.746718 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe686229_24bb_4b7c_aab8_64f78e05d1f1.slice/crio-514c2df53dfeb0334871b82f56dc63f6efb7abb92afaeb314b0d6d5072614d38 WatchSource:0}: Error finding container 514c2df53dfeb0334871b82f56dc63f6efb7abb92afaeb314b0d6d5072614d38: Status 404 returned error can't find the container with id 514c2df53dfeb0334871b82f56dc63f6efb7abb92afaeb314b0d6d5072614d38 Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.076515 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zgwn9"] Feb 17 09:23:41 crc kubenswrapper[4848]: W0217 09:23:41.090118 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f95863d_9560_4508_9115_5da47d8dd4c2.slice/crio-ceed4f850028fd13ab660b3be8e6ba8731fdb2261f35f3e612e7b1ac6135e2f8 WatchSource:0}: Error finding container ceed4f850028fd13ab660b3be8e6ba8731fdb2261f35f3e612e7b1ac6135e2f8: Status 404 returned error can't find the container with id ceed4f850028fd13ab660b3be8e6ba8731fdb2261f35f3e612e7b1ac6135e2f8 Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.302279 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad","Type":"ContainerStarted","Data":"d6a047b039309da3b1aa628f04b650c5eebc65a4baf6288e5982ee90da896f26"} Feb 17 09:23:41 crc kubenswrapper[4848]: 
I0217 09:23:41.304177 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06701a3a-e547-4022-87bc-9e40af29efa4","Type":"ContainerStarted","Data":"02b23b20c8816d77e5e150ad657ba14607b7bfae53d0c1f919b597c6716b9860"} Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.306493 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9vlc" event={"ID":"7a4198c9-c56f-45e9-90f7-486ddcb9d65f","Type":"ContainerStarted","Data":"13a3939801e3fe31e0d33cb6fb4431afa3e5044ec2189bea83022200dd7894fb"} Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.308424 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22f73908-5ca1-44fb-ad87-1f1b709284bf","Type":"ContainerStarted","Data":"7f6729c76284594877686b5d086b5a4ffb30ad4216f401cc301e3996a7a006e3"} Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.313957 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zgwn9" event={"ID":"5f95863d-9560-4508-9115-5da47d8dd4c2","Type":"ContainerStarted","Data":"d5214704be6a231b3d99cc7ed4e4eb92e94a850437d836cae3b99d1131d1cb01"} Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.314011 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zgwn9" event={"ID":"5f95863d-9560-4508-9115-5da47d8dd4c2","Type":"ContainerStarted","Data":"ceed4f850028fd13ab660b3be8e6ba8731fdb2261f35f3e612e7b1ac6135e2f8"} Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.316603 4848 generic.go:334] "Generic (PLEG): container finished" podID="fe686229-24bb-4b7c-aab8-64f78e05d1f1" containerID="858a90c2d08c5073a90808dff41266276dc577d8e48dbf8b9888ce027e11462d" exitCode=0 Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.316673 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" 
event={"ID":"fe686229-24bb-4b7c-aab8-64f78e05d1f1","Type":"ContainerDied","Data":"858a90c2d08c5073a90808dff41266276dc577d8e48dbf8b9888ce027e11462d"} Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.316700 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" event={"ID":"fe686229-24bb-4b7c-aab8-64f78e05d1f1","Type":"ContainerStarted","Data":"514c2df53dfeb0334871b82f56dc63f6efb7abb92afaeb314b0d6d5072614d38"} Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.330181 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e104ad7-09b3-4881-857c-0384ea158171","Type":"ContainerStarted","Data":"6ad283ac171c331651cb0b4485d0a010232e902c99be20abf3f22c5c6c951855"} Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.344172 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-h9vlc" podStartSLOduration=2.34415516 podStartE2EDuration="2.34415516s" podCreationTimestamp="2026-02-17 09:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:41.341894589 +0000 UTC m=+1098.885150235" watchObservedRunningTime="2026-02-17 09:23:41.34415516 +0000 UTC m=+1098.887410806" Feb 17 09:23:41 crc kubenswrapper[4848]: I0217 09:23:41.436031 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-zgwn9" podStartSLOduration=1.43600727 podStartE2EDuration="1.43600727s" podCreationTimestamp="2026-02-17 09:23:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:41.405010593 +0000 UTC m=+1098.948266249" watchObservedRunningTime="2026-02-17 09:23:41.43600727 +0000 UTC m=+1098.979262916" Feb 17 09:23:42 crc kubenswrapper[4848]: I0217 09:23:42.340753 4848 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" event={"ID":"fe686229-24bb-4b7c-aab8-64f78e05d1f1","Type":"ContainerStarted","Data":"602cd7a4b20ac413d3192147119f005fe6c9216e11594610180690750bd6da1b"} Feb 17 09:23:42 crc kubenswrapper[4848]: I0217 09:23:42.369912 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" podStartSLOduration=3.369886486 podStartE2EDuration="3.369886486s" podCreationTimestamp="2026-02-17 09:23:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:42.361398956 +0000 UTC m=+1099.904654642" watchObservedRunningTime="2026-02-17 09:23:42.369886486 +0000 UTC m=+1099.913142142" Feb 17 09:23:42 crc kubenswrapper[4848]: I0217 09:23:42.740238 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 17 09:23:42 crc kubenswrapper[4848]: I0217 09:23:42.753380 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:43 crc kubenswrapper[4848]: I0217 09:23:43.357017 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.371859 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad","Type":"ContainerStarted","Data":"e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469"} Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.371896 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="0a0a24b3-51ac-4726-ac00-f6d52f2b1bad" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469" gracePeriod=30 Feb 17 09:23:44 crc 
kubenswrapper[4848]: I0217 09:23:44.377051 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="06701a3a-e547-4022-87bc-9e40af29efa4" containerName="nova-metadata-log" containerID="cri-o://181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330" gracePeriod=30 Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.377194 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="06701a3a-e547-4022-87bc-9e40af29efa4" containerName="nova-metadata-metadata" containerID="cri-o://e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76" gracePeriod=30 Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.377589 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06701a3a-e547-4022-87bc-9e40af29efa4","Type":"ContainerStarted","Data":"e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76"} Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.377794 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06701a3a-e547-4022-87bc-9e40af29efa4","Type":"ContainerStarted","Data":"181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330"} Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.381638 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22f73908-5ca1-44fb-ad87-1f1b709284bf","Type":"ContainerStarted","Data":"4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964"} Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.381722 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22f73908-5ca1-44fb-ad87-1f1b709284bf","Type":"ContainerStarted","Data":"ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481"} Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.388397 4848 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-scheduler-0" event={"ID":"0e104ad7-09b3-4881-857c-0384ea158171","Type":"ContainerStarted","Data":"b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57"} Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.402498 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.834010868 podStartE2EDuration="5.402479117s" podCreationTimestamp="2026-02-17 09:23:39 +0000 UTC" firstStartedPulling="2026-02-17 09:23:40.614168771 +0000 UTC m=+1098.157424417" lastFinishedPulling="2026-02-17 09:23:43.18263702 +0000 UTC m=+1100.725892666" observedRunningTime="2026-02-17 09:23:44.393537996 +0000 UTC m=+1101.936793712" watchObservedRunningTime="2026-02-17 09:23:44.402479117 +0000 UTC m=+1101.945734763" Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.441223 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.742859536 podStartE2EDuration="5.441198673s" podCreationTimestamp="2026-02-17 09:23:39 +0000 UTC" firstStartedPulling="2026-02-17 09:23:40.483272306 +0000 UTC m=+1098.026527952" lastFinishedPulling="2026-02-17 09:23:43.181611443 +0000 UTC m=+1100.724867089" observedRunningTime="2026-02-17 09:23:44.428063898 +0000 UTC m=+1101.971319544" watchObservedRunningTime="2026-02-17 09:23:44.441198673 +0000 UTC m=+1101.984454319" Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.449659 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.995458987 podStartE2EDuration="5.449646151s" podCreationTimestamp="2026-02-17 09:23:39 +0000 UTC" firstStartedPulling="2026-02-17 09:23:40.751953891 +0000 UTC m=+1098.295209537" lastFinishedPulling="2026-02-17 09:23:43.206141055 +0000 UTC m=+1100.749396701" observedRunningTime="2026-02-17 09:23:44.446714991 +0000 UTC m=+1101.989970637" watchObservedRunningTime="2026-02-17 
09:23:44.449646151 +0000 UTC m=+1101.992901797" Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.487439 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.809917138 podStartE2EDuration="5.487421801s" podCreationTimestamp="2026-02-17 09:23:39 +0000 UTC" firstStartedPulling="2026-02-17 09:23:40.496427152 +0000 UTC m=+1098.039682798" lastFinishedPulling="2026-02-17 09:23:43.173931815 +0000 UTC m=+1100.717187461" observedRunningTime="2026-02-17 09:23:44.471709786 +0000 UTC m=+1102.014965452" watchObservedRunningTime="2026-02-17 09:23:44.487421801 +0000 UTC m=+1102.030677437" Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.872081 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.872355 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.901679 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:23:44 crc kubenswrapper[4848]: I0217 09:23:44.999591 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.020558 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.133042 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06701a3a-e547-4022-87bc-9e40af29efa4-logs\") pod \"06701a3a-e547-4022-87bc-9e40af29efa4\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.133166 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-config-data\") pod \"06701a3a-e547-4022-87bc-9e40af29efa4\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.133231 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-combined-ca-bundle\") pod \"06701a3a-e547-4022-87bc-9e40af29efa4\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.133306 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kw5f\" (UniqueName: \"kubernetes.io/projected/06701a3a-e547-4022-87bc-9e40af29efa4-kube-api-access-5kw5f\") pod \"06701a3a-e547-4022-87bc-9e40af29efa4\" (UID: \"06701a3a-e547-4022-87bc-9e40af29efa4\") " Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.133808 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06701a3a-e547-4022-87bc-9e40af29efa4-logs" (OuterVolumeSpecName: "logs") pod "06701a3a-e547-4022-87bc-9e40af29efa4" (UID: "06701a3a-e547-4022-87bc-9e40af29efa4"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.138672 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06701a3a-e547-4022-87bc-9e40af29efa4-kube-api-access-5kw5f" (OuterVolumeSpecName: "kube-api-access-5kw5f") pod "06701a3a-e547-4022-87bc-9e40af29efa4" (UID: "06701a3a-e547-4022-87bc-9e40af29efa4"). InnerVolumeSpecName "kube-api-access-5kw5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.165748 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-config-data" (OuterVolumeSpecName: "config-data") pod "06701a3a-e547-4022-87bc-9e40af29efa4" (UID: "06701a3a-e547-4022-87bc-9e40af29efa4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.166262 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06701a3a-e547-4022-87bc-9e40af29efa4" (UID: "06701a3a-e547-4022-87bc-9e40af29efa4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.236467 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kw5f\" (UniqueName: \"kubernetes.io/projected/06701a3a-e547-4022-87bc-9e40af29efa4-kube-api-access-5kw5f\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.237008 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/06701a3a-e547-4022-87bc-9e40af29efa4-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.237143 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.237292 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06701a3a-e547-4022-87bc-9e40af29efa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.399601 4848 generic.go:334] "Generic (PLEG): container finished" podID="06701a3a-e547-4022-87bc-9e40af29efa4" containerID="e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76" exitCode=0 Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.399637 4848 generic.go:334] "Generic (PLEG): container finished" podID="06701a3a-e547-4022-87bc-9e40af29efa4" containerID="181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330" exitCode=143 Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.400673 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06701a3a-e547-4022-87bc-9e40af29efa4","Type":"ContainerDied","Data":"e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76"} Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.400724 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06701a3a-e547-4022-87bc-9e40af29efa4","Type":"ContainerDied","Data":"181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330"} Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.400746 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"06701a3a-e547-4022-87bc-9e40af29efa4","Type":"ContainerDied","Data":"02b23b20c8816d77e5e150ad657ba14607b7bfae53d0c1f919b597c6716b9860"} Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.400803 4848 scope.go:117] "RemoveContainer" containerID="e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.401347 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.426099 4848 scope.go:117] "RemoveContainer" containerID="181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.449163 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.464633 4848 scope.go:117] "RemoveContainer" containerID="e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76" Feb 17 09:23:45 crc kubenswrapper[4848]: E0217 09:23:45.465258 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76\": container with ID starting with e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76 not found: ID does not exist" containerID="e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.465309 4848 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76"} err="failed to get container status \"e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76\": rpc error: code = NotFound desc = could not find container \"e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76\": container with ID starting with e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76 not found: ID does not exist" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.465339 4848 scope.go:117] "RemoveContainer" containerID="181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330" Feb 17 09:23:45 crc kubenswrapper[4848]: E0217 09:23:45.465960 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330\": container with ID starting with 181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330 not found: ID does not exist" containerID="181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.466027 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330"} err="failed to get container status \"181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330\": rpc error: code = NotFound desc = could not find container \"181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330\": container with ID starting with 181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330 not found: ID does not exist" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.466072 4848 scope.go:117] "RemoveContainer" containerID="e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.467642 4848 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76"} err="failed to get container status \"e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76\": rpc error: code = NotFound desc = could not find container \"e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76\": container with ID starting with e1f48c2c6f6c149f720e583f194e434cb3c2718f541e2e42593fca6f3c620d76 not found: ID does not exist" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.467674 4848 scope.go:117] "RemoveContainer" containerID="181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.468514 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330"} err="failed to get container status \"181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330\": rpc error: code = NotFound desc = could not find container \"181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330\": container with ID starting with 181a3d720872d63c55b11f98944242b678b07251076ea93b9af9e22cea506330 not found: ID does not exist" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.485471 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.495548 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:45 crc kubenswrapper[4848]: E0217 09:23:45.496144 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06701a3a-e547-4022-87bc-9e40af29efa4" containerName="nova-metadata-log" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.496168 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="06701a3a-e547-4022-87bc-9e40af29efa4" containerName="nova-metadata-log" Feb 17 09:23:45 crc kubenswrapper[4848]: E0217 
09:23:45.496217 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06701a3a-e547-4022-87bc-9e40af29efa4" containerName="nova-metadata-metadata" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.496232 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="06701a3a-e547-4022-87bc-9e40af29efa4" containerName="nova-metadata-metadata" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.496472 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="06701a3a-e547-4022-87bc-9e40af29efa4" containerName="nova-metadata-log" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.496489 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="06701a3a-e547-4022-87bc-9e40af29efa4" containerName="nova-metadata-metadata" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.497733 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.501610 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.511733 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.513337 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.646592 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.646715 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-7flw7\" (UniqueName: \"kubernetes.io/projected/5e0033d0-85cf-4353-a193-112d74d4671a-kube-api-access-7flw7\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.646894 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0033d0-85cf-4353-a193-112d74d4671a-logs\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.646989 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-config-data\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.647060 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.748741 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-config-data\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.748860 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.748991 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.749084 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7flw7\" (UniqueName: \"kubernetes.io/projected/5e0033d0-85cf-4353-a193-112d74d4671a-kube-api-access-7flw7\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.749204 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0033d0-85cf-4353-a193-112d74d4671a-logs\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.749936 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0033d0-85cf-4353-a193-112d74d4671a-logs\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.753306 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-config-data\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.753975 4848 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.760826 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.775270 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7flw7\" (UniqueName: \"kubernetes.io/projected/5e0033d0-85cf-4353-a193-112d74d4671a-kube-api-access-7flw7\") pod \"nova-metadata-0\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " pod="openstack/nova-metadata-0" Feb 17 09:23:45 crc kubenswrapper[4848]: I0217 09:23:45.828586 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:23:46 crc kubenswrapper[4848]: I0217 09:23:46.336387 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:46 crc kubenswrapper[4848]: W0217 09:23:46.351929 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0033d0_85cf_4353_a193_112d74d4671a.slice/crio-db251a4afee99b77cdac7b1b396c9bfd69fc783a20aca69b8c5706471e676598 WatchSource:0}: Error finding container db251a4afee99b77cdac7b1b396c9bfd69fc783a20aca69b8c5706471e676598: Status 404 returned error can't find the container with id db251a4afee99b77cdac7b1b396c9bfd69fc783a20aca69b8c5706471e676598 Feb 17 09:23:46 crc kubenswrapper[4848]: I0217 09:23:46.411153 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e0033d0-85cf-4353-a193-112d74d4671a","Type":"ContainerStarted","Data":"db251a4afee99b77cdac7b1b396c9bfd69fc783a20aca69b8c5706471e676598"} Feb 17 09:23:47 crc kubenswrapper[4848]: I0217 09:23:47.397643 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06701a3a-e547-4022-87bc-9e40af29efa4" path="/var/lib/kubelet/pods/06701a3a-e547-4022-87bc-9e40af29efa4/volumes" Feb 17 09:23:47 crc kubenswrapper[4848]: I0217 09:23:47.423552 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e0033d0-85cf-4353-a193-112d74d4671a","Type":"ContainerStarted","Data":"4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9"} Feb 17 09:23:47 crc kubenswrapper[4848]: I0217 09:23:47.423596 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e0033d0-85cf-4353-a193-112d74d4671a","Type":"ContainerStarted","Data":"6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f"} Feb 17 09:23:47 crc kubenswrapper[4848]: I0217 09:23:47.463597 4848 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.463566919 podStartE2EDuration="2.463566919s" podCreationTimestamp="2026-02-17 09:23:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:47.449393666 +0000 UTC m=+1104.992649372" watchObservedRunningTime="2026-02-17 09:23:47.463566919 +0000 UTC m=+1105.006822595" Feb 17 09:23:48 crc kubenswrapper[4848]: I0217 09:23:48.438428 4848 generic.go:334] "Generic (PLEG): container finished" podID="7a4198c9-c56f-45e9-90f7-486ddcb9d65f" containerID="13a3939801e3fe31e0d33cb6fb4431afa3e5044ec2189bea83022200dd7894fb" exitCode=0 Feb 17 09:23:48 crc kubenswrapper[4848]: I0217 09:23:48.438591 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9vlc" event={"ID":"7a4198c9-c56f-45e9-90f7-486ddcb9d65f","Type":"ContainerDied","Data":"13a3939801e3fe31e0d33cb6fb4431afa3e5044ec2189bea83022200dd7894fb"} Feb 17 09:23:49 crc kubenswrapper[4848]: I0217 09:23:49.450337 4848 generic.go:334] "Generic (PLEG): container finished" podID="5f95863d-9560-4508-9115-5da47d8dd4c2" containerID="d5214704be6a231b3d99cc7ed4e4eb92e94a850437d836cae3b99d1131d1cb01" exitCode=0 Feb 17 09:23:49 crc kubenswrapper[4848]: I0217 09:23:49.450409 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zgwn9" event={"ID":"5f95863d-9560-4508-9115-5da47d8dd4c2","Type":"ContainerDied","Data":"d5214704be6a231b3d99cc7ed4e4eb92e94a850437d836cae3b99d1131d1cb01"} Feb 17 09:23:49 crc kubenswrapper[4848]: I0217 09:23:49.860211 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 09:23:49 crc kubenswrapper[4848]: I0217 09:23:49.860591 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 09:23:49 crc 
kubenswrapper[4848]: I0217 09:23:49.876651 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.020522 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.038385 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-scripts\") pod \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.038525 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-config-data\") pod \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.038571 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-combined-ca-bundle\") pod \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.038608 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlhcb\" (UniqueName: \"kubernetes.io/projected/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-kube-api-access-mlhcb\") pod \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\" (UID: \"7a4198c9-c56f-45e9-90f7-486ddcb9d65f\") " Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.044662 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-kube-api-access-mlhcb" (OuterVolumeSpecName: 
"kube-api-access-mlhcb") pod "7a4198c9-c56f-45e9-90f7-486ddcb9d65f" (UID: "7a4198c9-c56f-45e9-90f7-486ddcb9d65f"). InnerVolumeSpecName "kube-api-access-mlhcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.044694 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-scripts" (OuterVolumeSpecName: "scripts") pod "7a4198c9-c56f-45e9-90f7-486ddcb9d65f" (UID: "7a4198c9-c56f-45e9-90f7-486ddcb9d65f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.066030 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.067992 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a4198c9-c56f-45e9-90f7-486ddcb9d65f" (UID: "7a4198c9-c56f-45e9-90f7-486ddcb9d65f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.081789 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-config-data" (OuterVolumeSpecName: "config-data") pod "7a4198c9-c56f-45e9-90f7-486ddcb9d65f" (UID: "7a4198c9-c56f-45e9-90f7-486ddcb9d65f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.119387 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.142943 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.142981 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.142995 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlhcb\" (UniqueName: \"kubernetes.io/projected/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-kube-api-access-mlhcb\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.143007 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a4198c9-c56f-45e9-90f7-486ddcb9d65f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.203619 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-8jdgt"] Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.203923 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" podUID="78406f6e-a154-4a75-96f6-99f77e092176" containerName="dnsmasq-dns" containerID="cri-o://a74a6b83f8ac40260aa2adfaa182c9faa6bc65f4d668463b06789b35919903b1" gracePeriod=10 Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.474652 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-h9vlc" 
event={"ID":"7a4198c9-c56f-45e9-90f7-486ddcb9d65f","Type":"ContainerDied","Data":"d2933fe557f6f89db33290275d95d52d92327b3979a8a03101ceb46a06ce3894"} Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.474711 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2933fe557f6f89db33290275d95d52d92327b3979a8a03101ceb46a06ce3894" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.474803 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-h9vlc" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.481273 4848 generic.go:334] "Generic (PLEG): container finished" podID="78406f6e-a154-4a75-96f6-99f77e092176" containerID="a74a6b83f8ac40260aa2adfaa182c9faa6bc65f4d668463b06789b35919903b1" exitCode=0 Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.481483 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" event={"ID":"78406f6e-a154-4a75-96f6-99f77e092176","Type":"ContainerDied","Data":"a74a6b83f8ac40260aa2adfaa182c9faa6bc65f4d668463b06789b35919903b1"} Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.515805 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.638707 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.638905 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerName="nova-api-log" containerID="cri-o://ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481" gracePeriod=30 Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.639400 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" 
containerName="nova-api-api" containerID="cri-o://4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964" gracePeriod=30 Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.644844 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": EOF" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.644984 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.189:8774/\": EOF" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.655556 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.655790 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e0033d0-85cf-4353-a193-112d74d4671a" containerName="nova-metadata-log" containerID="cri-o://6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f" gracePeriod=30 Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.656168 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5e0033d0-85cf-4353-a193-112d74d4671a" containerName="nova-metadata-metadata" containerID="cri-o://4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9" gracePeriod=30 Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.674282 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.828755 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.829037 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.856286 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thk8g\" (UniqueName: \"kubernetes.io/projected/78406f6e-a154-4a75-96f6-99f77e092176-kube-api-access-thk8g\") pod \"78406f6e-a154-4a75-96f6-99f77e092176\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.856323 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-svc\") pod \"78406f6e-a154-4a75-96f6-99f77e092176\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.856421 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-swift-storage-0\") pod \"78406f6e-a154-4a75-96f6-99f77e092176\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.856597 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-sb\") pod \"78406f6e-a154-4a75-96f6-99f77e092176\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.856641 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-config\") pod \"78406f6e-a154-4a75-96f6-99f77e092176\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.856661 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-nb\") pod \"78406f6e-a154-4a75-96f6-99f77e092176\" (UID: \"78406f6e-a154-4a75-96f6-99f77e092176\") " Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.870555 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78406f6e-a154-4a75-96f6-99f77e092176-kube-api-access-thk8g" (OuterVolumeSpecName: "kube-api-access-thk8g") pod "78406f6e-a154-4a75-96f6-99f77e092176" (UID: "78406f6e-a154-4a75-96f6-99f77e092176"). InnerVolumeSpecName "kube-api-access-thk8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.921313 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "78406f6e-a154-4a75-96f6-99f77e092176" (UID: "78406f6e-a154-4a75-96f6-99f77e092176"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.936058 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "78406f6e-a154-4a75-96f6-99f77e092176" (UID: "78406f6e-a154-4a75-96f6-99f77e092176"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.954267 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "78406f6e-a154-4a75-96f6-99f77e092176" (UID: "78406f6e-a154-4a75-96f6-99f77e092176"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.956725 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-config" (OuterVolumeSpecName: "config") pod "78406f6e-a154-4a75-96f6-99f77e092176" (UID: "78406f6e-a154-4a75-96f6-99f77e092176"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.959354 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-thk8g\" (UniqueName: \"kubernetes.io/projected/78406f6e-a154-4a75-96f6-99f77e092176-kube-api-access-thk8g\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.959459 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.959555 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.959632 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 
09:23:50.959714 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:50 crc kubenswrapper[4848]: I0217 09:23:50.962096 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "78406f6e-a154-4a75-96f6-99f77e092176" (UID: "78406f6e-a154-4a75-96f6-99f77e092176"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.064868 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.066938 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78406f6e-a154-4a75-96f6-99f77e092176-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.107878 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.221188 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.270589 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-scripts\") pod \"5f95863d-9560-4508-9115-5da47d8dd4c2\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.270719 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-config-data\") pod \"5f95863d-9560-4508-9115-5da47d8dd4c2\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.270739 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-combined-ca-bundle\") pod \"5f95863d-9560-4508-9115-5da47d8dd4c2\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.270826 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwpbx\" (UniqueName: \"kubernetes.io/projected/5f95863d-9560-4508-9115-5da47d8dd4c2-kube-api-access-gwpbx\") pod \"5f95863d-9560-4508-9115-5da47d8dd4c2\" (UID: \"5f95863d-9560-4508-9115-5da47d8dd4c2\") " Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.276893 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-scripts" (OuterVolumeSpecName: "scripts") pod "5f95863d-9560-4508-9115-5da47d8dd4c2" (UID: "5f95863d-9560-4508-9115-5da47d8dd4c2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.278970 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f95863d-9560-4508-9115-5da47d8dd4c2-kube-api-access-gwpbx" (OuterVolumeSpecName: "kube-api-access-gwpbx") pod "5f95863d-9560-4508-9115-5da47d8dd4c2" (UID: "5f95863d-9560-4508-9115-5da47d8dd4c2"). InnerVolumeSpecName "kube-api-access-gwpbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.301657 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-config-data" (OuterVolumeSpecName: "config-data") pod "5f95863d-9560-4508-9115-5da47d8dd4c2" (UID: "5f95863d-9560-4508-9115-5da47d8dd4c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.306927 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f95863d-9560-4508-9115-5da47d8dd4c2" (UID: "5f95863d-9560-4508-9115-5da47d8dd4c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.372667 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-combined-ca-bundle\") pod \"5e0033d0-85cf-4353-a193-112d74d4671a\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.372803 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-config-data\") pod \"5e0033d0-85cf-4353-a193-112d74d4671a\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.372927 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0033d0-85cf-4353-a193-112d74d4671a-logs\") pod \"5e0033d0-85cf-4353-a193-112d74d4671a\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.373046 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7flw7\" (UniqueName: \"kubernetes.io/projected/5e0033d0-85cf-4353-a193-112d74d4671a-kube-api-access-7flw7\") pod \"5e0033d0-85cf-4353-a193-112d74d4671a\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.373106 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-nova-metadata-tls-certs\") pod \"5e0033d0-85cf-4353-a193-112d74d4671a\" (UID: \"5e0033d0-85cf-4353-a193-112d74d4671a\") " Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.373543 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwpbx\" (UniqueName: 
\"kubernetes.io/projected/5f95863d-9560-4508-9115-5da47d8dd4c2-kube-api-access-gwpbx\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.373559 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.373569 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.373596 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f95863d-9560-4508-9115-5da47d8dd4c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.373833 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e0033d0-85cf-4353-a193-112d74d4671a-logs" (OuterVolumeSpecName: "logs") pod "5e0033d0-85cf-4353-a193-112d74d4671a" (UID: "5e0033d0-85cf-4353-a193-112d74d4671a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.384834 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0033d0-85cf-4353-a193-112d74d4671a-kube-api-access-7flw7" (OuterVolumeSpecName: "kube-api-access-7flw7") pod "5e0033d0-85cf-4353-a193-112d74d4671a" (UID: "5e0033d0-85cf-4353-a193-112d74d4671a"). InnerVolumeSpecName "kube-api-access-7flw7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.399484 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-config-data" (OuterVolumeSpecName: "config-data") pod "5e0033d0-85cf-4353-a193-112d74d4671a" (UID: "5e0033d0-85cf-4353-a193-112d74d4671a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.403006 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e0033d0-85cf-4353-a193-112d74d4671a" (UID: "5e0033d0-85cf-4353-a193-112d74d4671a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.421388 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5e0033d0-85cf-4353-a193-112d74d4671a" (UID: "5e0033d0-85cf-4353-a193-112d74d4671a"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.475732 4848 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.475786 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.475798 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e0033d0-85cf-4353-a193-112d74d4671a-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.475807 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5e0033d0-85cf-4353-a193-112d74d4671a-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.475817 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7flw7\" (UniqueName: \"kubernetes.io/projected/5e0033d0-85cf-4353-a193-112d74d4671a-kube-api-access-7flw7\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.518582 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-zgwn9" event={"ID":"5f95863d-9560-4508-9115-5da47d8dd4c2","Type":"ContainerDied","Data":"ceed4f850028fd13ab660b3be8e6ba8731fdb2261f35f3e612e7b1ac6135e2f8"} Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.518895 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceed4f850028fd13ab660b3be8e6ba8731fdb2261f35f3e612e7b1ac6135e2f8" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.518629 
4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-zgwn9" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.520748 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" event={"ID":"78406f6e-a154-4a75-96f6-99f77e092176","Type":"ContainerDied","Data":"2ff482fa86e0c155003bf91f25e9b15b3ebeb86e1516dce17e5bc64a41848835"} Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.520827 4848 scope.go:117] "RemoveContainer" containerID="a74a6b83f8ac40260aa2adfaa182c9faa6bc65f4d668463b06789b35919903b1" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.520867 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6dc67df487-8jdgt" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.529903 4848 generic.go:334] "Generic (PLEG): container finished" podID="5e0033d0-85cf-4353-a193-112d74d4671a" containerID="4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9" exitCode=0 Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.529945 4848 generic.go:334] "Generic (PLEG): container finished" podID="5e0033d0-85cf-4353-a193-112d74d4671a" containerID="6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f" exitCode=143 Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.530059 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.529996 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e0033d0-85cf-4353-a193-112d74d4671a","Type":"ContainerDied","Data":"4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9"} Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.530115 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e0033d0-85cf-4353-a193-112d74d4671a","Type":"ContainerDied","Data":"6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f"} Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.530130 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5e0033d0-85cf-4353-a193-112d74d4671a","Type":"ContainerDied","Data":"db251a4afee99b77cdac7b1b396c9bfd69fc783a20aca69b8c5706471e676598"} Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.546650 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-8jdgt"] Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.552260 4848 generic.go:334] "Generic (PLEG): container finished" podID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerID="ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481" exitCode=143 Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.552430 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22f73908-5ca1-44fb-ad87-1f1b709284bf","Type":"ContainerDied","Data":"ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481"} Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.556817 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 09:23:51 crc kubenswrapper[4848]: E0217 09:23:51.557201 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0033d0-85cf-4353-a193-112d74d4671a" 
containerName="nova-metadata-log" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557213 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0033d0-85cf-4353-a193-112d74d4671a" containerName="nova-metadata-log" Feb 17 09:23:51 crc kubenswrapper[4848]: E0217 09:23:51.557234 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f95863d-9560-4508-9115-5da47d8dd4c2" containerName="nova-cell1-conductor-db-sync" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557240 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f95863d-9560-4508-9115-5da47d8dd4c2" containerName="nova-cell1-conductor-db-sync" Feb 17 09:23:51 crc kubenswrapper[4848]: E0217 09:23:51.557253 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4198c9-c56f-45e9-90f7-486ddcb9d65f" containerName="nova-manage" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557259 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4198c9-c56f-45e9-90f7-486ddcb9d65f" containerName="nova-manage" Feb 17 09:23:51 crc kubenswrapper[4848]: E0217 09:23:51.557271 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78406f6e-a154-4a75-96f6-99f77e092176" containerName="dnsmasq-dns" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557277 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="78406f6e-a154-4a75-96f6-99f77e092176" containerName="dnsmasq-dns" Feb 17 09:23:51 crc kubenswrapper[4848]: E0217 09:23:51.557291 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e0033d0-85cf-4353-a193-112d74d4671a" containerName="nova-metadata-metadata" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557296 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0033d0-85cf-4353-a193-112d74d4671a" containerName="nova-metadata-metadata" Feb 17 09:23:51 crc kubenswrapper[4848]: E0217 09:23:51.557310 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="78406f6e-a154-4a75-96f6-99f77e092176" containerName="init" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557317 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="78406f6e-a154-4a75-96f6-99f77e092176" containerName="init" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557491 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f95863d-9560-4508-9115-5da47d8dd4c2" containerName="nova-cell1-conductor-db-sync" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557507 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0033d0-85cf-4353-a193-112d74d4671a" containerName="nova-metadata-log" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557516 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4198c9-c56f-45e9-90f7-486ddcb9d65f" containerName="nova-manage" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557529 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="78406f6e-a154-4a75-96f6-99f77e092176" containerName="dnsmasq-dns" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.557543 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e0033d0-85cf-4353-a193-112d74d4671a" containerName="nova-metadata-metadata" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.558113 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.560290 4848 scope.go:117] "RemoveContainer" containerID="27c292dc0f347a6aec6815206c97764234ac510790a8a3434e2edda99e2a412d" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.560827 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.580612 4848 scope.go:117] "RemoveContainer" containerID="4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.599915 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6dc67df487-8jdgt"] Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.616777 4848 scope.go:117] "RemoveContainer" containerID="6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.618533 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.630893 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.638814 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.647387 4848 scope.go:117] "RemoveContainer" containerID="4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9" Feb 17 09:23:51 crc kubenswrapper[4848]: E0217 09:23:51.647707 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9\": container with ID starting with 4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9 not found: ID does not exist" 
containerID="4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.647735 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9"} err="failed to get container status \"4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9\": rpc error: code = NotFound desc = could not find container \"4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9\": container with ID starting with 4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9 not found: ID does not exist" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.647769 4848 scope.go:117] "RemoveContainer" containerID="6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f" Feb 17 09:23:51 crc kubenswrapper[4848]: E0217 09:23:51.647964 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f\": container with ID starting with 6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f not found: ID does not exist" containerID="6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.647982 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f"} err="failed to get container status \"6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f\": rpc error: code = NotFound desc = could not find container \"6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f\": container with ID starting with 6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f not found: ID does not exist" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.647996 4848 scope.go:117] 
"RemoveContainer" containerID="4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.648380 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9"} err="failed to get container status \"4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9\": rpc error: code = NotFound desc = could not find container \"4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9\": container with ID starting with 4ff0da98799385a561e190db0d6ad4ae7045745b6dbc91e055d7a8ba3b1f34e9 not found: ID does not exist" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.648428 4848 scope.go:117] "RemoveContainer" containerID="6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.648695 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f"} err="failed to get container status \"6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f\": rpc error: code = NotFound desc = could not find container \"6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f\": container with ID starting with 6463e04ef5be1a619e4221d82cc9a4f5537a0b234d879686b75235b73ce7b17f not found: ID does not exist" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.657525 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.659085 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.662267 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.662321 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.665722 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.689397 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828cf207-286c-427a-80f4-5713b1128ecc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"828cf207-286c-427a-80f4-5713b1128ecc\") " pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.689560 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828cf207-286c-427a-80f4-5713b1128ecc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"828cf207-286c-427a-80f4-5713b1128ecc\") " pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.689618 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xjzz\" (UniqueName: \"kubernetes.io/projected/828cf207-286c-427a-80f4-5713b1128ecc-kube-api-access-6xjzz\") pod \"nova-cell1-conductor-0\" (UID: \"828cf207-286c-427a-80f4-5713b1128ecc\") " pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.791699 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828cf207-286c-427a-80f4-5713b1128ecc-config-data\") 
pod \"nova-cell1-conductor-0\" (UID: \"828cf207-286c-427a-80f4-5713b1128ecc\") " pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.791797 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828cf207-286c-427a-80f4-5713b1128ecc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"828cf207-286c-427a-80f4-5713b1128ecc\") " pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.791843 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xjzz\" (UniqueName: \"kubernetes.io/projected/828cf207-286c-427a-80f4-5713b1128ecc-kube-api-access-6xjzz\") pod \"nova-cell1-conductor-0\" (UID: \"828cf207-286c-427a-80f4-5713b1128ecc\") " pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.791878 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.791898 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-config-data\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.791931 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71d697a6-85b3-4801-8521-c72a378cbed0-logs\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " 
pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.791953 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxrkg\" (UniqueName: \"kubernetes.io/projected/71d697a6-85b3-4801-8521-c72a378cbed0-kube-api-access-zxrkg\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.792011 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.796544 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/828cf207-286c-427a-80f4-5713b1128ecc-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"828cf207-286c-427a-80f4-5713b1128ecc\") " pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.797259 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/828cf207-286c-427a-80f4-5713b1128ecc-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"828cf207-286c-427a-80f4-5713b1128ecc\") " pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.809357 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xjzz\" (UniqueName: \"kubernetes.io/projected/828cf207-286c-427a-80f4-5713b1128ecc-kube-api-access-6xjzz\") pod \"nova-cell1-conductor-0\" (UID: \"828cf207-286c-427a-80f4-5713b1128ecc\") " pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.877445 4848 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.893751 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.893973 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.894004 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-config-data\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.894589 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71d697a6-85b3-4801-8521-c72a378cbed0-logs\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.894627 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxrkg\" (UniqueName: \"kubernetes.io/projected/71d697a6-85b3-4801-8521-c72a378cbed0-kube-api-access-zxrkg\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.894933 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71d697a6-85b3-4801-8521-c72a378cbed0-logs\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.897558 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-config-data\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.897739 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.898851 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.911657 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxrkg\" (UniqueName: \"kubernetes.io/projected/71d697a6-85b3-4801-8521-c72a378cbed0-kube-api-access-zxrkg\") pod \"nova-metadata-0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " pod="openstack/nova-metadata-0" Feb 17 09:23:51 crc kubenswrapper[4848]: I0217 09:23:51.972998 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:23:52 crc kubenswrapper[4848]: I0217 09:23:52.335947 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 17 09:23:52 crc kubenswrapper[4848]: I0217 09:23:52.488632 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:23:52 crc kubenswrapper[4848]: W0217 09:23:52.490192 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71d697a6_85b3_4801_8521_c72a378cbed0.slice/crio-60104c804a9be6f4ad92f4c22895d851f04632814a2d995069bbb01d456add3c WatchSource:0}: Error finding container 60104c804a9be6f4ad92f4c22895d851f04632814a2d995069bbb01d456add3c: Status 404 returned error can't find the container with id 60104c804a9be6f4ad92f4c22895d851f04632814a2d995069bbb01d456add3c Feb 17 09:23:52 crc kubenswrapper[4848]: I0217 09:23:52.561311 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"828cf207-286c-427a-80f4-5713b1128ecc","Type":"ContainerStarted","Data":"7bc01229ced692c0b5d635925b3e59ef139ffe55122dc5ca4562208e10018b93"} Feb 17 09:23:52 crc kubenswrapper[4848]: I0217 09:23:52.561354 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"828cf207-286c-427a-80f4-5713b1128ecc","Type":"ContainerStarted","Data":"594d68d601696870ec859fbcb78aa1e16238fdb90ac7cc6fc7f726f151cf3805"} Feb 17 09:23:52 crc kubenswrapper[4848]: I0217 09:23:52.562522 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 17 09:23:52 crc kubenswrapper[4848]: I0217 09:23:52.564921 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71d697a6-85b3-4801-8521-c72a378cbed0","Type":"ContainerStarted","Data":"60104c804a9be6f4ad92f4c22895d851f04632814a2d995069bbb01d456add3c"} 
Feb 17 09:23:52 crc kubenswrapper[4848]: I0217 09:23:52.567482 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="0e104ad7-09b3-4881-857c-0384ea158171" containerName="nova-scheduler-scheduler" containerID="cri-o://b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57" gracePeriod=30 Feb 17 09:23:52 crc kubenswrapper[4848]: I0217 09:23:52.593482 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.593444318 podStartE2EDuration="1.593444318s" podCreationTimestamp="2026-02-17 09:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:52.583807717 +0000 UTC m=+1110.127063363" watchObservedRunningTime="2026-02-17 09:23:52.593444318 +0000 UTC m=+1110.136699964" Feb 17 09:23:53 crc kubenswrapper[4848]: I0217 09:23:53.402838 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e0033d0-85cf-4353-a193-112d74d4671a" path="/var/lib/kubelet/pods/5e0033d0-85cf-4353-a193-112d74d4671a/volumes" Feb 17 09:23:53 crc kubenswrapper[4848]: I0217 09:23:53.405259 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78406f6e-a154-4a75-96f6-99f77e092176" path="/var/lib/kubelet/pods/78406f6e-a154-4a75-96f6-99f77e092176/volumes" Feb 17 09:23:53 crc kubenswrapper[4848]: I0217 09:23:53.584146 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71d697a6-85b3-4801-8521-c72a378cbed0","Type":"ContainerStarted","Data":"170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4"} Feb 17 09:23:53 crc kubenswrapper[4848]: I0217 09:23:53.584211 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"71d697a6-85b3-4801-8521-c72a378cbed0","Type":"ContainerStarted","Data":"35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6"} Feb 17 09:23:53 crc kubenswrapper[4848]: I0217 09:23:53.627697 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.627666633 podStartE2EDuration="2.627666633s" podCreationTimestamp="2026-02-17 09:23:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:53.619076131 +0000 UTC m=+1111.162331827" watchObservedRunningTime="2026-02-17 09:23:53.627666633 +0000 UTC m=+1111.170922319" Feb 17 09:23:55 crc kubenswrapper[4848]: E0217 09:23:55.024360 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57 is running failed: container process not found" containerID="b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 09:23:55 crc kubenswrapper[4848]: E0217 09:23:55.027894 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57 is running failed: container process not found" containerID="b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 09:23:55 crc kubenswrapper[4848]: E0217 09:23:55.028238 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57 is running failed: container process not found" 
containerID="b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 09:23:55 crc kubenswrapper[4848]: E0217 09:23:55.028270 4848 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="0e104ad7-09b3-4881-857c-0384ea158171" containerName="nova-scheduler-scheduler" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.282767 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.364564 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8km9p\" (UniqueName: \"kubernetes.io/projected/0e104ad7-09b3-4881-857c-0384ea158171-kube-api-access-8km9p\") pod \"0e104ad7-09b3-4881-857c-0384ea158171\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.364810 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-combined-ca-bundle\") pod \"0e104ad7-09b3-4881-857c-0384ea158171\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.365917 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-config-data\") pod \"0e104ad7-09b3-4881-857c-0384ea158171\" (UID: \"0e104ad7-09b3-4881-857c-0384ea158171\") " Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.390619 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0e104ad7-09b3-4881-857c-0384ea158171-kube-api-access-8km9p" (OuterVolumeSpecName: "kube-api-access-8km9p") pod "0e104ad7-09b3-4881-857c-0384ea158171" (UID: "0e104ad7-09b3-4881-857c-0384ea158171"). InnerVolumeSpecName "kube-api-access-8km9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.413832 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e104ad7-09b3-4881-857c-0384ea158171" (UID: "0e104ad7-09b3-4881-857c-0384ea158171"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.415822 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-config-data" (OuterVolumeSpecName: "config-data") pod "0e104ad7-09b3-4881-857c-0384ea158171" (UID: "0e104ad7-09b3-4881-857c-0384ea158171"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.468417 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.468456 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8km9p\" (UniqueName: \"kubernetes.io/projected/0e104ad7-09b3-4881-857c-0384ea158171-kube-api-access-8km9p\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.468473 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e104ad7-09b3-4881-857c-0384ea158171-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.642445 4848 generic.go:334] "Generic (PLEG): container finished" podID="0e104ad7-09b3-4881-857c-0384ea158171" containerID="b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57" exitCode=0 Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.642627 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e104ad7-09b3-4881-857c-0384ea158171","Type":"ContainerDied","Data":"b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57"} Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.642938 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"0e104ad7-09b3-4881-857c-0384ea158171","Type":"ContainerDied","Data":"6ad283ac171c331651cb0b4485d0a010232e902c99be20abf3f22c5c6c951855"} Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.642974 4848 scope.go:117] "RemoveContainer" containerID="b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.642809 4848 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.674138 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.684806 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.695410 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:23:55 crc kubenswrapper[4848]: E0217 09:23:55.695878 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e104ad7-09b3-4881-857c-0384ea158171" containerName="nova-scheduler-scheduler" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.695897 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e104ad7-09b3-4881-857c-0384ea158171" containerName="nova-scheduler-scheduler" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.696106 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e104ad7-09b3-4881-857c-0384ea158171" containerName="nova-scheduler-scheduler" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.696866 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.700535 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.707430 4848 scope.go:117] "RemoveContainer" containerID="b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57" Feb 17 09:23:55 crc kubenswrapper[4848]: E0217 09:23:55.708245 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57\": container with ID starting with b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57 not found: ID does not exist" containerID="b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.708287 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57"} err="failed to get container status \"b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57\": rpc error: code = NotFound desc = could not find container \"b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57\": container with ID starting with b47a3d41236a0740f4c13346c9369364e23f9a72fd34ce6ce673dd3299d8ad57 not found: ID does not exist" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.712165 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.773325 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mb22\" (UniqueName: \"kubernetes.io/projected/2f1b1861-5e43-493d-a3de-9686bac8fb14-kube-api-access-8mb22\") pod \"nova-scheduler-0\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " 
pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.773413 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.773567 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-config-data\") pod \"nova-scheduler-0\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.875160 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.875229 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-config-data\") pod \"nova-scheduler-0\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.875339 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mb22\" (UniqueName: \"kubernetes.io/projected/2f1b1861-5e43-493d-a3de-9686bac8fb14-kube-api-access-8mb22\") pod \"nova-scheduler-0\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.879194 4848 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.880677 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-config-data\") pod \"nova-scheduler-0\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:55 crc kubenswrapper[4848]: I0217 09:23:55.890164 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mb22\" (UniqueName: \"kubernetes.io/projected/2f1b1861-5e43-493d-a3de-9686bac8fb14-kube-api-access-8mb22\") pod \"nova-scheduler-0\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " pod="openstack/nova-scheduler-0" Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.023902 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.493373 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Feb 17 09:23:56 crc kubenswrapper[4848]: W0217 09:23:56.499286 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f1b1861_5e43_493d_a3de_9686bac8fb14.slice/crio-69c6ded57a6924b8ddd4f92b763938d01d6e449840912c5cc8621e2a0e4aecab WatchSource:0}: Error finding container 69c6ded57a6924b8ddd4f92b763938d01d6e449840912c5cc8621e2a0e4aecab: Status 404 returned error can't find the container with id 69c6ded57a6924b8ddd4f92b763938d01d6e449840912c5cc8621e2a0e4aecab
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.541815 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.541951 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.588617 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22f73908-5ca1-44fb-ad87-1f1b709284bf-logs\") pod \"22f73908-5ca1-44fb-ad87-1f1b709284bf\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") "
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.589036 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmdn9\" (UniqueName: \"kubernetes.io/projected/22f73908-5ca1-44fb-ad87-1f1b709284bf-kube-api-access-wmdn9\") pod \"22f73908-5ca1-44fb-ad87-1f1b709284bf\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") "
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.589062 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-combined-ca-bundle\") pod \"22f73908-5ca1-44fb-ad87-1f1b709284bf\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") "
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.589213 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-config-data\") pod \"22f73908-5ca1-44fb-ad87-1f1b709284bf\" (UID: \"22f73908-5ca1-44fb-ad87-1f1b709284bf\") "
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.589206 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22f73908-5ca1-44fb-ad87-1f1b709284bf-logs" (OuterVolumeSpecName: "logs") pod "22f73908-5ca1-44fb-ad87-1f1b709284bf" (UID: "22f73908-5ca1-44fb-ad87-1f1b709284bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.589718 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22f73908-5ca1-44fb-ad87-1f1b709284bf-logs\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.594423 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22f73908-5ca1-44fb-ad87-1f1b709284bf-kube-api-access-wmdn9" (OuterVolumeSpecName: "kube-api-access-wmdn9") pod "22f73908-5ca1-44fb-ad87-1f1b709284bf" (UID: "22f73908-5ca1-44fb-ad87-1f1b709284bf"). InnerVolumeSpecName "kube-api-access-wmdn9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.628993 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-config-data" (OuterVolumeSpecName: "config-data") pod "22f73908-5ca1-44fb-ad87-1f1b709284bf" (UID: "22f73908-5ca1-44fb-ad87-1f1b709284bf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.635917 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22f73908-5ca1-44fb-ad87-1f1b709284bf" (UID: "22f73908-5ca1-44fb-ad87-1f1b709284bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.656666 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2f1b1861-5e43-493d-a3de-9686bac8fb14","Type":"ContainerStarted","Data":"69c6ded57a6924b8ddd4f92b763938d01d6e449840912c5cc8621e2a0e4aecab"}
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.659250 4848 generic.go:334] "Generic (PLEG): container finished" podID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerID="4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964" exitCode=0
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.659309 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.659334 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22f73908-5ca1-44fb-ad87-1f1b709284bf","Type":"ContainerDied","Data":"4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964"}
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.659532 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"22f73908-5ca1-44fb-ad87-1f1b709284bf","Type":"ContainerDied","Data":"7f6729c76284594877686b5d086b5a4ffb30ad4216f401cc301e3996a7a006e3"}
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.659554 4848 scope.go:117] "RemoveContainer" containerID="4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.687305 4848 scope.go:117] "RemoveContainer" containerID="ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.691049 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmdn9\" (UniqueName: \"kubernetes.io/projected/22f73908-5ca1-44fb-ad87-1f1b709284bf-kube-api-access-wmdn9\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.691078 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.691090 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22f73908-5ca1-44fb-ad87-1f1b709284bf-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.714256 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.731753 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.733196 4848 scope.go:117] "RemoveContainer" containerID="4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964"
Feb 17 09:23:56 crc kubenswrapper[4848]: E0217 09:23:56.733646 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964\": container with ID starting with 4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964 not found: ID does not exist" containerID="4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.733692 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964"} err="failed to get container status \"4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964\": rpc error: code = NotFound desc = could not find container \"4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964\": container with ID starting with 4d65192a6f64ec28bb40ebe76ae41364f9d1f4a6d89f9ebfcc6521de75dd4964 not found: ID does not exist"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.733721 4848 scope.go:117] "RemoveContainer" containerID="ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481"
Feb 17 09:23:56 crc kubenswrapper[4848]: E0217 09:23:56.734042 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481\": container with ID starting with ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481 not found: ID does not exist" containerID="ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.734094 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481"} err="failed to get container status \"ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481\": rpc error: code = NotFound desc = could not find container \"ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481\": container with ID starting with ed0cb2c265ced71f9d2a48479837cae3e2aaf759c70ef46c03283df848fe0481 not found: ID does not exist"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.742501 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 17 09:23:56 crc kubenswrapper[4848]: E0217 09:23:56.742912 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerName="nova-api-api"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.742932 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerName="nova-api-api"
Feb 17 09:23:56 crc kubenswrapper[4848]: E0217 09:23:56.742988 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerName="nova-api-log"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.742996 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerName="nova-api-log"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.743154 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerName="nova-api-api"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.743184 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" containerName="nova-api-log"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.744082 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.747333 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.752180 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.793610 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-config-data\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.793788 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d10bd453-8470-4867-b631-e5beac75fd90-logs\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.793931 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.793977 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mn7f\" (UniqueName: \"kubernetes.io/projected/d10bd453-8470-4867-b631-e5beac75fd90-kube-api-access-7mn7f\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.896211 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-config-data\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.896309 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d10bd453-8470-4867-b631-e5beac75fd90-logs\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.896364 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.896405 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mn7f\" (UniqueName: \"kubernetes.io/projected/d10bd453-8470-4867-b631-e5beac75fd90-kube-api-access-7mn7f\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.897644 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d10bd453-8470-4867-b631-e5beac75fd90-logs\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.900866 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-config-data\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.901411 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.914368 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mn7f\" (UniqueName: \"kubernetes.io/projected/d10bd453-8470-4867-b631-e5beac75fd90-kube-api-access-7mn7f\") pod \"nova-api-0\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " pod="openstack/nova-api-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.973598 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 17 09:23:56 crc kubenswrapper[4848]: I0217 09:23:56.974396 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 17 09:23:57 crc kubenswrapper[4848]: I0217 09:23:57.080563 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 17 09:23:57 crc kubenswrapper[4848]: W0217 09:23:57.366700 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd10bd453_8470_4867_b631_e5beac75fd90.slice/crio-75db05ae649c9d99341f2916d022a383bba0636f13468dc0157d651c51f66a71 WatchSource:0}: Error finding container 75db05ae649c9d99341f2916d022a383bba0636f13468dc0157d651c51f66a71: Status 404 returned error can't find the container with id 75db05ae649c9d99341f2916d022a383bba0636f13468dc0157d651c51f66a71
Feb 17 09:23:57 crc kubenswrapper[4848]: I0217 09:23:57.370804 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 17 09:23:57 crc kubenswrapper[4848]: I0217 09:23:57.424985 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e104ad7-09b3-4881-857c-0384ea158171" path="/var/lib/kubelet/pods/0e104ad7-09b3-4881-857c-0384ea158171/volumes"
Feb 17 09:23:57 crc kubenswrapper[4848]: I0217 09:23:57.425636 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22f73908-5ca1-44fb-ad87-1f1b709284bf" path="/var/lib/kubelet/pods/22f73908-5ca1-44fb-ad87-1f1b709284bf/volumes"
Feb 17 09:23:57 crc kubenswrapper[4848]: I0217 09:23:57.672057 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d10bd453-8470-4867-b631-e5beac75fd90","Type":"ContainerStarted","Data":"203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1"}
Feb 17 09:23:57 crc kubenswrapper[4848]: I0217 09:23:57.672423 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d10bd453-8470-4867-b631-e5beac75fd90","Type":"ContainerStarted","Data":"75db05ae649c9d99341f2916d022a383bba0636f13468dc0157d651c51f66a71"}
Feb 17 09:23:57 crc kubenswrapper[4848]: I0217 09:23:57.676536 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2f1b1861-5e43-493d-a3de-9686bac8fb14","Type":"ContainerStarted","Data":"364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb"}
Feb 17 09:23:57 crc kubenswrapper[4848]: I0217 09:23:57.702641 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.7026238989999998 podStartE2EDuration="2.702623899s" podCreationTimestamp="2026-02-17 09:23:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:57.695392354 +0000 UTC m=+1115.238648010" watchObservedRunningTime="2026-02-17 09:23:57.702623899 +0000 UTC m=+1115.245879545"
Feb 17 09:23:58 crc kubenswrapper[4848]: I0217 09:23:58.689413 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d10bd453-8470-4867-b631-e5beac75fd90","Type":"ContainerStarted","Data":"8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de"}
Feb 17 09:23:58 crc kubenswrapper[4848]: I0217 09:23:58.714743 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.714726696 podStartE2EDuration="2.714726696s" podCreationTimestamp="2026-02-17 09:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:23:58.71040532 +0000 UTC m=+1116.253660976" watchObservedRunningTime="2026-02-17 09:23:58.714726696 +0000 UTC m=+1116.257982342"
Feb 17 09:23:59 crc kubenswrapper[4848]: I0217 09:23:59.903849 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 17 09:23:59 crc kubenswrapper[4848]: I0217 09:23:59.904078 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="7c3fada3-5297-4ddf-ada7-fe24f8a9b17f" containerName="kube-state-metrics" containerID="cri-o://ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7" gracePeriod=30
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.384996 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.469739 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx8mr\" (UniqueName: \"kubernetes.io/projected/7c3fada3-5297-4ddf-ada7-fe24f8a9b17f-kube-api-access-rx8mr\") pod \"7c3fada3-5297-4ddf-ada7-fe24f8a9b17f\" (UID: \"7c3fada3-5297-4ddf-ada7-fe24f8a9b17f\") "
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.475329 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3fada3-5297-4ddf-ada7-fe24f8a9b17f-kube-api-access-rx8mr" (OuterVolumeSpecName: "kube-api-access-rx8mr") pod "7c3fada3-5297-4ddf-ada7-fe24f8a9b17f" (UID: "7c3fada3-5297-4ddf-ada7-fe24f8a9b17f"). InnerVolumeSpecName "kube-api-access-rx8mr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.572209 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx8mr\" (UniqueName: \"kubernetes.io/projected/7c3fada3-5297-4ddf-ada7-fe24f8a9b17f-kube-api-access-rx8mr\") on node \"crc\" DevicePath \"\""
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.710298 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c3fada3-5297-4ddf-ada7-fe24f8a9b17f" containerID="ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7" exitCode=2
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.710332 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7c3fada3-5297-4ddf-ada7-fe24f8a9b17f","Type":"ContainerDied","Data":"ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7"}
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.710370 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7c3fada3-5297-4ddf-ada7-fe24f8a9b17f","Type":"ContainerDied","Data":"94b1a33d43ba1ca534e518c6355ec088dc2b1e175067667569de4ce16f882fef"}
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.710377 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.710387 4848 scope.go:117] "RemoveContainer" containerID="ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.740403 4848 scope.go:117] "RemoveContainer" containerID="ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7"
Feb 17 09:24:00 crc kubenswrapper[4848]: E0217 09:24:00.742023 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7\": container with ID starting with ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7 not found: ID does not exist" containerID="ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.742184 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7"} err="failed to get container status \"ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7\": rpc error: code = NotFound desc = could not find container \"ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7\": container with ID starting with ee17a2483d756e7df0546e9312b07996ed341cd5cc9dff67706090878579b4f7 not found: ID does not exist"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.757725 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.769641 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.787714 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 17 09:24:00 crc kubenswrapper[4848]: E0217 09:24:00.788427 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3fada3-5297-4ddf-ada7-fe24f8a9b17f" containerName="kube-state-metrics"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.788455 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3fada3-5297-4ddf-ada7-fe24f8a9b17f" containerName="kube-state-metrics"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.788845 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3fada3-5297-4ddf-ada7-fe24f8a9b17f" containerName="kube-state-metrics"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.789888 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.792665 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.793258 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.817910 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.890644 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c3da4a-6e24-4b18-8c11-26d2255aebcc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.890683 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mj9r\" (UniqueName: \"kubernetes.io/projected/10c3da4a-6e24-4b18-8c11-26d2255aebcc-kube-api-access-9mj9r\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.890794 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c3da4a-6e24-4b18-8c11-26d2255aebcc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.890816 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10c3da4a-6e24-4b18-8c11-26d2255aebcc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.995189 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c3da4a-6e24-4b18-8c11-26d2255aebcc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.995238 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mj9r\" (UniqueName: \"kubernetes.io/projected/10c3da4a-6e24-4b18-8c11-26d2255aebcc-kube-api-access-9mj9r\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.995339 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c3da4a-6e24-4b18-8c11-26d2255aebcc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:00 crc kubenswrapper[4848]: I0217 09:24:00.995364 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10c3da4a-6e24-4b18-8c11-26d2255aebcc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.000891 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10c3da4a-6e24-4b18-8c11-26d2255aebcc-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.000929 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/10c3da4a-6e24-4b18-8c11-26d2255aebcc-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.003242 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/10c3da4a-6e24-4b18-8c11-26d2255aebcc-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.017614 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mj9r\" (UniqueName: \"kubernetes.io/projected/10c3da4a-6e24-4b18-8c11-26d2255aebcc-kube-api-access-9mj9r\") pod \"kube-state-metrics-0\" (UID: \"10c3da4a-6e24-4b18-8c11-26d2255aebcc\") " pod="openstack/kube-state-metrics-0"
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.024035 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.113757 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.392644 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c3fada3-5297-4ddf-ada7-fe24f8a9b17f" path="/var/lib/kubelet/pods/7c3fada3-5297-4ddf-ada7-fe24f8a9b17f/volumes"
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.547548 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.741060 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"10c3da4a-6e24-4b18-8c11-26d2255aebcc","Type":"ContainerStarted","Data":"a909a646383becf23d803269c20a2369d331fff6b81b750cce53697d30d3e7fd"}
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.741468 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.741810 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="ceilometer-central-agent" containerID="cri-o://d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49" gracePeriod=30
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.742308 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="proxy-httpd" containerID="cri-o://2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565" gracePeriod=30
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.742375 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="sg-core" containerID="cri-o://43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4" gracePeriod=30
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.742438 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="ceilometer-notification-agent" containerID="cri-o://44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0" gracePeriod=30
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.912636 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.982904 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 17 09:24:01 crc kubenswrapper[4848]: I0217 09:24:01.982936 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.756340 4848 generic.go:334] "Generic (PLEG): container finished" podID="b9561f05-0df5-433a-ba20-050c453688cf" containerID="2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565" exitCode=0
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.756746 4848 generic.go:334] "Generic (PLEG): container finished" podID="b9561f05-0df5-433a-ba20-050c453688cf" containerID="43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4" exitCode=2
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.756784 4848 generic.go:334] "Generic (PLEG): container finished" podID="b9561f05-0df5-433a-ba20-050c453688cf" containerID="d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49" exitCode=0
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.756599 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9561f05-0df5-433a-ba20-050c453688cf","Type":"ContainerDied","Data":"2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565"}
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.756871 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9561f05-0df5-433a-ba20-050c453688cf","Type":"ContainerDied","Data":"43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4"}
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.756890 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9561f05-0df5-433a-ba20-050c453688cf","Type":"ContainerDied","Data":"d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49"}
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.759699 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"10c3da4a-6e24-4b18-8c11-26d2255aebcc","Type":"ContainerStarted","Data":"e395aec60edb6313ca1753e6950c4b4fcce8f26d62452f2d78128bb93be84c63"}
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.759926 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.771906 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.389312931 podStartE2EDuration="2.771885901s" podCreationTimestamp="2026-02-17 09:24:00 +0000 UTC" firstStartedPulling="2026-02-17 09:24:01.550898174 +0000 UTC m=+1119.094153820" lastFinishedPulling="2026-02-17 09:24:01.933471154 +0000 UTC m=+1119.476726790" observedRunningTime="2026-02-17 09:24:02.771255384 +0000 UTC m=+1120.314511050" watchObservedRunningTime="2026-02-17 09:24:02.771885901 +0000 UTC m=+1120.315141557"
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.999024 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 09:24:02 crc kubenswrapper[4848]: I0217 09:24:02.999361 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.194938 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.253378 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-combined-ca-bundle\") pod \"b9561f05-0df5-433a-ba20-050c453688cf\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") "
Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.253519 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-scripts\") pod \"b9561f05-0df5-433a-ba20-050c453688cf\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") "
Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.253601 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-log-httpd\") pod \"b9561f05-0df5-433a-ba20-050c453688cf\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") "
Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.253703 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-sg-core-conf-yaml\") pod \"b9561f05-0df5-433a-ba20-050c453688cf\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") "
Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.253792 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nf84l\" (UniqueName: \"kubernetes.io/projected/b9561f05-0df5-433a-ba20-050c453688cf-kube-api-access-nf84l\") pod \"b9561f05-0df5-433a-ba20-050c453688cf\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") "
Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.253835 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-run-httpd\") pod \"b9561f05-0df5-433a-ba20-050c453688cf\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") "
Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.253876 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-config-data\") pod \"b9561f05-0df5-433a-ba20-050c453688cf\" (UID: \"b9561f05-0df5-433a-ba20-050c453688cf\") "
Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.254184 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b9561f05-0df5-433a-ba20-050c453688cf" (UID: "b9561f05-0df5-433a-ba20-050c453688cf"). InnerVolumeSpecName "log-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.254241 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b9561f05-0df5-433a-ba20-050c453688cf" (UID: "b9561f05-0df5-433a-ba20-050c453688cf"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.254743 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.254785 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b9561f05-0df5-433a-ba20-050c453688cf-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.259361 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9561f05-0df5-433a-ba20-050c453688cf-kube-api-access-nf84l" (OuterVolumeSpecName: "kube-api-access-nf84l") pod "b9561f05-0df5-433a-ba20-050c453688cf" (UID: "b9561f05-0df5-433a-ba20-050c453688cf"). InnerVolumeSpecName "kube-api-access-nf84l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.260672 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-scripts" (OuterVolumeSpecName: "scripts") pod "b9561f05-0df5-433a-ba20-050c453688cf" (UID: "b9561f05-0df5-433a-ba20-050c453688cf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.297681 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b9561f05-0df5-433a-ba20-050c453688cf" (UID: "b9561f05-0df5-433a-ba20-050c453688cf"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.362590 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.362873 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nf84l\" (UniqueName: \"kubernetes.io/projected/b9561f05-0df5-433a-ba20-050c453688cf-kube-api-access-nf84l\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.362886 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.373041 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9561f05-0df5-433a-ba20-050c453688cf" (UID: "b9561f05-0df5-433a-ba20-050c453688cf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.392748 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-config-data" (OuterVolumeSpecName: "config-data") pod "b9561f05-0df5-433a-ba20-050c453688cf" (UID: "b9561f05-0df5-433a-ba20-050c453688cf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.464539 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.464566 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9561f05-0df5-433a-ba20-050c453688cf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.795434 4848 generic.go:334] "Generic (PLEG): container finished" podID="b9561f05-0df5-433a-ba20-050c453688cf" containerID="44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0" exitCode=0 Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.795481 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9561f05-0df5-433a-ba20-050c453688cf","Type":"ContainerDied","Data":"44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0"} Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.795510 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b9561f05-0df5-433a-ba20-050c453688cf","Type":"ContainerDied","Data":"c752e5d5211cbf7525f086dc9ae26057384173e77aa716a9bccc0c1594b2fef0"} Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.795616 4848 scope.go:117] "RemoveContainer" 
containerID="2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.795812 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.842533 4848 scope.go:117] "RemoveContainer" containerID="43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.845259 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.852258 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.879492 4848 scope.go:117] "RemoveContainer" containerID="44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.885748 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:24:04 crc kubenswrapper[4848]: E0217 09:24:04.886103 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="ceilometer-notification-agent" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.886116 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="ceilometer-notification-agent" Feb 17 09:24:04 crc kubenswrapper[4848]: E0217 09:24:04.886134 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="ceilometer-central-agent" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.886140 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="ceilometer-central-agent" Feb 17 09:24:04 crc kubenswrapper[4848]: E0217 09:24:04.886156 4848 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="proxy-httpd" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.886162 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="proxy-httpd" Feb 17 09:24:04 crc kubenswrapper[4848]: E0217 09:24:04.886169 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="sg-core" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.886175 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="sg-core" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.888527 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="ceilometer-central-agent" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.888552 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="sg-core" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.888563 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="ceilometer-notification-agent" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.888576 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9561f05-0df5-433a-ba20-050c453688cf" containerName="proxy-httpd" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.890572 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.893259 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.893502 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.893829 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.912665 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.967608 4848 scope.go:117] "RemoveContainer" containerID="d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.972377 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-config-data\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.972420 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.972549 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " 
pod="openstack/ceilometer-0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.972626 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-log-httpd\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.972660 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-run-httpd\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.972752 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-scripts\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.972819 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8vwd\" (UniqueName: \"kubernetes.io/projected/47322ca8-f0b0-4d41-95e6-87a0dadb398f-kube-api-access-g8vwd\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.972900 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.992987 4848 scope.go:117] "RemoveContainer" 
containerID="2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565" Feb 17 09:24:04 crc kubenswrapper[4848]: E0217 09:24:04.993469 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565\": container with ID starting with 2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565 not found: ID does not exist" containerID="2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.993521 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565"} err="failed to get container status \"2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565\": rpc error: code = NotFound desc = could not find container \"2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565\": container with ID starting with 2cfcb605b880a51361168614dac8aa19462a94fd08f2a65b21c3cf7f30393565 not found: ID does not exist" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.993633 4848 scope.go:117] "RemoveContainer" containerID="43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4" Feb 17 09:24:04 crc kubenswrapper[4848]: E0217 09:24:04.993995 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4\": container with ID starting with 43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4 not found: ID does not exist" containerID="43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.994055 4848 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4"} err="failed to get container status \"43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4\": rpc error: code = NotFound desc = could not find container \"43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4\": container with ID starting with 43d0e7b153ad0126ffeead7250c7f7285d20202caf5e85afe967af2b92a79eb4 not found: ID does not exist" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.994082 4848 scope.go:117] "RemoveContainer" containerID="44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0" Feb 17 09:24:04 crc kubenswrapper[4848]: E0217 09:24:04.994447 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0\": container with ID starting with 44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0 not found: ID does not exist" containerID="44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.994491 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0"} err="failed to get container status \"44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0\": rpc error: code = NotFound desc = could not find container \"44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0\": container with ID starting with 44c92dc0f2b9efa669a9bc114c5761ece1690ad5eec95fa7446f14e7998dcbb0 not found: ID does not exist" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.994518 4848 scope.go:117] "RemoveContainer" containerID="d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49" Feb 17 09:24:04 crc kubenswrapper[4848]: E0217 09:24:04.995003 4848 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49\": container with ID starting with d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49 not found: ID does not exist" containerID="d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49" Feb 17 09:24:04 crc kubenswrapper[4848]: I0217 09:24:04.995023 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49"} err="failed to get container status \"d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49\": rpc error: code = NotFound desc = could not find container \"d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49\": container with ID starting with d04f04f59fe19a6cb9d94967b2aa062774baf755ba75905812ac8aeea6330e49 not found: ID does not exist" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.074450 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.074536 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-config-data\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.074569 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " 
pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.074638 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.074677 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-log-httpd\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.074703 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-run-httpd\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.074787 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-scripts\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.074818 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8vwd\" (UniqueName: \"kubernetes.io/projected/47322ca8-f0b0-4d41-95e6-87a0dadb398f-kube-api-access-g8vwd\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.075588 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-run-httpd\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.075654 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-log-httpd\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.078681 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.079170 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.081462 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.081628 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-scripts\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.084972 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-config-data\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.092702 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8vwd\" (UniqueName: \"kubernetes.io/projected/47322ca8-f0b0-4d41-95e6-87a0dadb398f-kube-api-access-g8vwd\") pod \"ceilometer-0\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") " pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.260449 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.402902 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9561f05-0df5-433a-ba20-050c453688cf" path="/var/lib/kubelet/pods/b9561f05-0df5-433a-ba20-050c453688cf/volumes" Feb 17 09:24:05 crc kubenswrapper[4848]: W0217 09:24:05.728581 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47322ca8_f0b0_4d41_95e6_87a0dadb398f.slice/crio-eb3bbe13ff28bd063a333423906017f3f73242fa288c12a082c05e4c6f4e4b6e WatchSource:0}: Error finding container eb3bbe13ff28bd063a333423906017f3f73242fa288c12a082c05e4c6f4e4b6e: Status 404 returned error can't find the container with id eb3bbe13ff28bd063a333423906017f3f73242fa288c12a082c05e4c6f4e4b6e Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.730718 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:24:05 crc kubenswrapper[4848]: I0217 09:24:05.806940 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"47322ca8-f0b0-4d41-95e6-87a0dadb398f","Type":"ContainerStarted","Data":"eb3bbe13ff28bd063a333423906017f3f73242fa288c12a082c05e4c6f4e4b6e"}
Feb 17 09:24:06 crc kubenswrapper[4848]: I0217 09:24:06.024960 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 17 09:24:06 crc kubenswrapper[4848]: I0217 09:24:06.070612 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 17 09:24:06 crc kubenswrapper[4848]: I0217 09:24:06.819238 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47322ca8-f0b0-4d41-95e6-87a0dadb398f","Type":"ContainerStarted","Data":"2e03f64fd10a82ed995e1e37db36ad0c9a0e0f0d04660af4b70f5de082d24a60"}
Feb 17 09:24:06 crc kubenswrapper[4848]: I0217 09:24:06.851180 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 17 09:24:07 crc kubenswrapper[4848]: I0217 09:24:07.081512 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 17 09:24:07 crc kubenswrapper[4848]: I0217 09:24:07.082046 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 17 09:24:07 crc kubenswrapper[4848]: I0217 09:24:07.829661 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47322ca8-f0b0-4d41-95e6-87a0dadb398f","Type":"ContainerStarted","Data":"7bcb06962eb299e81b2b72d6fa76ff704752b056d7354b11a38d158c4c275085"}
Feb 17 09:24:07 crc kubenswrapper[4848]: I0217 09:24:07.829701 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47322ca8-f0b0-4d41-95e6-87a0dadb398f","Type":"ContainerStarted","Data":"b02b0297b2b09f1a55e30656fbe3b9bbaf64381349bf32db409cec54b3941a01"}
Feb 17 09:24:08 crc kubenswrapper[4848]: I0217 09:24:08.164944 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d10bd453-8470-4867-b631-e5beac75fd90" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 09:24:08 crc kubenswrapper[4848]: I0217 09:24:08.164992 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d10bd453-8470-4867-b631-e5beac75fd90" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.199:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Feb 17 09:24:09 crc kubenswrapper[4848]: I0217 09:24:09.856625 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47322ca8-f0b0-4d41-95e6-87a0dadb398f","Type":"ContainerStarted","Data":"2bd4f43ebe118f33d899658bfe42d19da1a070ab8a7b54a2409ca7bd35b6c9e4"}
Feb 17 09:24:09 crc kubenswrapper[4848]: I0217 09:24:09.858469 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 17 09:24:09 crc kubenswrapper[4848]: I0217 09:24:09.893319 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.8313825599999998 podStartE2EDuration="5.893296613s" podCreationTimestamp="2026-02-17 09:24:04 +0000 UTC" firstStartedPulling="2026-02-17 09:24:05.732718316 +0000 UTC m=+1123.275973962" lastFinishedPulling="2026-02-17 09:24:08.794632369 +0000 UTC m=+1126.337888015" observedRunningTime="2026-02-17 09:24:09.879558352 +0000 UTC m=+1127.422814038" watchObservedRunningTime="2026-02-17 09:24:09.893296613 +0000 UTC m=+1127.436552259"
Feb 17 09:24:11 crc kubenswrapper[4848]: I0217 09:24:11.125198 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Feb 17 09:24:11 crc kubenswrapper[4848]: I0217 09:24:11.979656 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 17 09:24:11 crc kubenswrapper[4848]: I0217 09:24:11.979804 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 17 09:24:11 crc kubenswrapper[4848]: I0217 09:24:11.985588 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 17 09:24:11 crc kubenswrapper[4848]: I0217 09:24:11.986103 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.790440 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.856696 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8mpq\" (UniqueName: \"kubernetes.io/projected/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-kube-api-access-s8mpq\") pod \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") "
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.856988 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-combined-ca-bundle\") pod \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") "
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.857026 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-config-data\") pod \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\" (UID: \"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad\") "
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.862222 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-kube-api-access-s8mpq" (OuterVolumeSpecName: "kube-api-access-s8mpq") pod "0a0a24b3-51ac-4726-ac00-f6d52f2b1bad" (UID: "0a0a24b3-51ac-4726-ac00-f6d52f2b1bad"). InnerVolumeSpecName "kube-api-access-s8mpq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.897390 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-config-data" (OuterVolumeSpecName: "config-data") pod "0a0a24b3-51ac-4726-ac00-f6d52f2b1bad" (UID: "0a0a24b3-51ac-4726-ac00-f6d52f2b1bad"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.905734 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0a0a24b3-51ac-4726-ac00-f6d52f2b1bad" (UID: "0a0a24b3-51ac-4726-ac00-f6d52f2b1bad"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.909659 4848 generic.go:334] "Generic (PLEG): container finished" podID="0a0a24b3-51ac-4726-ac00-f6d52f2b1bad" containerID="e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469" exitCode=137
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.909706 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad","Type":"ContainerDied","Data":"e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469"}
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.909737 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0a0a24b3-51ac-4726-ac00-f6d52f2b1bad","Type":"ContainerDied","Data":"d6a047b039309da3b1aa628f04b650c5eebc65a4baf6288e5982ee90da896f26"}
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.909772 4848 scope.go:117] "RemoveContainer" containerID="e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469"
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.909912 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.959131 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.959179 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.959198 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8mpq\" (UniqueName: \"kubernetes.io/projected/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad-kube-api-access-s8mpq\") on node \"crc\" DevicePath \"\""
Feb 17 09:24:14 crc kubenswrapper[4848]: I0217 09:24:14.988504 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.004686 4848 scope.go:117] "RemoveContainer" containerID="e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469"
Feb 17 09:24:15 crc kubenswrapper[4848]: E0217 09:24:15.005843 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469\": container with ID starting with e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469 not found: ID does not exist" containerID="e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.005881 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469"} err="failed to get container status \"e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469\": rpc error: code = NotFound desc = could not find container \"e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469\": container with ID starting with e901a001e8a4d54934842433f03798ea569c729c276d62fbb2e833a436656469 not found: ID does not exist"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.007978 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.029955 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 09:24:15 crc kubenswrapper[4848]: E0217 09:24:15.030403 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a0a24b3-51ac-4726-ac00-f6d52f2b1bad" containerName="nova-cell1-novncproxy-novncproxy"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.030419 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a0a24b3-51ac-4726-ac00-f6d52f2b1bad" containerName="nova-cell1-novncproxy-novncproxy"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.030612 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a0a24b3-51ac-4726-ac00-f6d52f2b1bad" containerName="nova-cell1-novncproxy-novncproxy"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.032786 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.034947 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.035185 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.035333 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.040578 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.162358 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.162415 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.162448 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d48ls\" (UniqueName: \"kubernetes.io/projected/0b2f03bc-3c70-4dc8-9478-5474155fdf90-kube-api-access-d48ls\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.162550 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.162577 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.264121 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.264181 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.264218 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d48ls\" (UniqueName: \"kubernetes.io/projected/0b2f03bc-3c70-4dc8-9478-5474155fdf90-kube-api-access-d48ls\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.264307 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.264330 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.268661 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.269085 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.270543 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.272644 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0b2f03bc-3c70-4dc8-9478-5474155fdf90-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.280365 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d48ls\" (UniqueName: \"kubernetes.io/projected/0b2f03bc-3c70-4dc8-9478-5474155fdf90-kube-api-access-d48ls\") pod \"nova-cell1-novncproxy-0\" (UID: \"0b2f03bc-3c70-4dc8-9478-5474155fdf90\") " pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.352472 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.399338 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a0a24b3-51ac-4726-ac00-f6d52f2b1bad" path="/var/lib/kubelet/pods/0a0a24b3-51ac-4726-ac00-f6d52f2b1bad/volumes"
Feb 17 09:24:15 crc kubenswrapper[4848]: I0217 09:24:15.846673 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Feb 17 09:24:16 crc kubenswrapper[4848]: I0217 09:24:16.932956 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0b2f03bc-3c70-4dc8-9478-5474155fdf90","Type":"ContainerStarted","Data":"f9f7ecf8f4051861a3e822007be1b74d44346d303eb196d76c6b9d90536a2c6d"}
Feb 17 09:24:16 crc kubenswrapper[4848]: I0217 09:24:16.933342 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"0b2f03bc-3c70-4dc8-9478-5474155fdf90","Type":"ContainerStarted","Data":"51394da98d835774f6278a0a0dfc4d9a86ee9793fee53ffdeb9bf280c31973f5"}
Feb 17 09:24:16 crc kubenswrapper[4848]: I0217 09:24:16.962909 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.962881185 podStartE2EDuration="2.962881185s" podCreationTimestamp="2026-02-17 09:24:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:24:16.961478977 +0000 UTC m=+1134.504734623" watchObservedRunningTime="2026-02-17 09:24:16.962881185 +0000 UTC m=+1134.506136871"
Feb 17 09:24:17 crc kubenswrapper[4848]: I0217 09:24:17.087990 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 17 09:24:17 crc kubenswrapper[4848]: I0217 09:24:17.089604 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 17 09:24:17 crc kubenswrapper[4848]: I0217 09:24:17.090296 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 17 09:24:17 crc kubenswrapper[4848]: I0217 09:24:17.100422 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 17 09:24:17 crc kubenswrapper[4848]: I0217 09:24:17.945348 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 17 09:24:17 crc kubenswrapper[4848]: I0217 09:24:17.952128 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.179115 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-gt9wf"]
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.185604 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.205000 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-gt9wf"]
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.224858 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.224901 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-config\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.224931 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-svc\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.225028 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.225130 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9drct\" (UniqueName: \"kubernetes.io/projected/87968358-75c5-4fb2-b3b1-8cf4e806611d-kube-api-access-9drct\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.225182 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-swift-storage-0\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.326973 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.327112 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9drct\" (UniqueName: \"kubernetes.io/projected/87968358-75c5-4fb2-b3b1-8cf4e806611d-kube-api-access-9drct\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.327158 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-swift-storage-0\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.327207 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.327227 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-config\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.327247 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-svc\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.328677 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-sb\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.329344 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-nb\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.330783 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-config\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.331625 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-swift-storage-0\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.331664 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-svc\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.366492 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9drct\" (UniqueName: \"kubernetes.io/projected/87968358-75c5-4fb2-b3b1-8cf4e806611d-kube-api-access-9drct\") pod \"dnsmasq-dns-7dcd758995-gt9wf\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:18 crc kubenswrapper[4848]: I0217 09:24:18.519981 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:19 crc kubenswrapper[4848]: I0217 09:24:19.033138 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-gt9wf"]
Feb 17 09:24:19 crc kubenswrapper[4848]: I0217 09:24:19.966445 4848 generic.go:334] "Generic (PLEG): container finished" podID="87968358-75c5-4fb2-b3b1-8cf4e806611d" containerID="5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed" exitCode=0
Feb 17 09:24:19 crc kubenswrapper[4848]: I0217 09:24:19.966558 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf" event={"ID":"87968358-75c5-4fb2-b3b1-8cf4e806611d","Type":"ContainerDied","Data":"5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed"}
Feb 17 09:24:19 crc kubenswrapper[4848]: I0217 09:24:19.966814 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf" event={"ID":"87968358-75c5-4fb2-b3b1-8cf4e806611d","Type":"ContainerStarted","Data":"2277c6a4a4e0b718cf9a65c4d061c702159c6ce7a0a6769b4aad8a37ca551a73"}
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.114975 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.115369 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="ceilometer-central-agent" containerID="cri-o://2e03f64fd10a82ed995e1e37db36ad0c9a0e0f0d04660af4b70f5de082d24a60" gracePeriod=30
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.115499 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="ceilometer-notification-agent" containerID="cri-o://b02b0297b2b09f1a55e30656fbe3b9bbaf64381349bf32db409cec54b3941a01" gracePeriod=30
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.115521 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="sg-core" containerID="cri-o://7bcb06962eb299e81b2b72d6fa76ff704752b056d7354b11a38d158c4c275085" gracePeriod=30
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.115649 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="proxy-httpd" containerID="cri-o://2bd4f43ebe118f33d899658bfe42d19da1a070ab8a7b54a2409ca7bd35b6c9e4" gracePeriod=30
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.144574 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502"
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.295167 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.354288 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.980774 4848 generic.go:334] "Generic (PLEG): container finished" podID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerID="2bd4f43ebe118f33d899658bfe42d19da1a070ab8a7b54a2409ca7bd35b6c9e4" exitCode=0
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.980814 4848 generic.go:334] "Generic (PLEG): container finished" podID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerID="7bcb06962eb299e81b2b72d6fa76ff704752b056d7354b11a38d158c4c275085" exitCode=2
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.980828 4848 generic.go:334] "Generic (PLEG): container finished" podID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerID="2e03f64fd10a82ed995e1e37db36ad0c9a0e0f0d04660af4b70f5de082d24a60" exitCode=0
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.980799 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47322ca8-f0b0-4d41-95e6-87a0dadb398f","Type":"ContainerDied","Data":"2bd4f43ebe118f33d899658bfe42d19da1a070ab8a7b54a2409ca7bd35b6c9e4"}
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.981884 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47322ca8-f0b0-4d41-95e6-87a0dadb398f","Type":"ContainerDied","Data":"7bcb06962eb299e81b2b72d6fa76ff704752b056d7354b11a38d158c4c275085"}
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.982055 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47322ca8-f0b0-4d41-95e6-87a0dadb398f","Type":"ContainerDied","Data":"2e03f64fd10a82ed995e1e37db36ad0c9a0e0f0d04660af4b70f5de082d24a60"}
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.983200 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d10bd453-8470-4867-b631-e5beac75fd90" containerName="nova-api-log" containerID="cri-o://203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1" gracePeriod=30
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.983915 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf" event={"ID":"87968358-75c5-4fb2-b3b1-8cf4e806611d","Type":"ContainerStarted","Data":"812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845"}
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.983944 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d10bd453-8470-4867-b631-e5beac75fd90" containerName="nova-api-api" containerID="cri-o://8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de" gracePeriod=30
Feb 17 09:24:20 crc kubenswrapper[4848]: I0217 09:24:20.984067 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf"
Feb 17 09:24:21 crc kubenswrapper[4848]: I0217 09:24:21.017547 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf" podStartSLOduration=3.017520062 podStartE2EDuration="3.017520062s" podCreationTimestamp="2026-02-17 09:24:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:24:21.011061728 +0000 UTC m=+1138.554317374" watchObservedRunningTime="2026-02-17 09:24:21.017520062 +0000 UTC m=+1138.560775748"
Feb 17 09:24:21 crc kubenswrapper[4848]: I0217 09:24:21.993448 4848 generic.go:334] "Generic (PLEG): container finished" podID="d10bd453-8470-4867-b631-e5beac75fd90" containerID="203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1" exitCode=143
Feb 17 09:24:21 crc kubenswrapper[4848]: I0217 09:24:21.993548 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d10bd453-8470-4867-b631-e5beac75fd90","Type":"ContainerDied","Data":"203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1"}
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.006700 4848 generic.go:334] "Generic (PLEG): container finished" podID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerID="b02b0297b2b09f1a55e30656fbe3b9bbaf64381349bf32db409cec54b3941a01" exitCode=0
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.006902 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47322ca8-f0b0-4d41-95e6-87a0dadb398f","Type":"ContainerDied","Data":"b02b0297b2b09f1a55e30656fbe3b9bbaf64381349bf32db409cec54b3941a01"}
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.265665 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.415104 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-config-data\") pod \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") "
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.415171 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-sg-core-conf-yaml\") pod \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") "
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.415210 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-log-httpd\") pod \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") "
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.415245 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-combined-ca-bundle\") pod \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") "
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.415380 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-ceilometer-tls-certs\") pod \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") "
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.415518 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-run-httpd\") pod \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") "
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.415552 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-scripts\") pod \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") "
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.415611 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8vwd\" (UniqueName: \"kubernetes.io/projected/47322ca8-f0b0-4d41-95e6-87a0dadb398f-kube-api-access-g8vwd\") pod \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\" (UID: \"47322ca8-f0b0-4d41-95e6-87a0dadb398f\") "
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.415749 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "47322ca8-f0b0-4d41-95e6-87a0dadb398f" (UID: "47322ca8-f0b0-4d41-95e6-87a0dadb398f"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.416242 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "47322ca8-f0b0-4d41-95e6-87a0dadb398f" (UID: "47322ca8-f0b0-4d41-95e6-87a0dadb398f"). InnerVolumeSpecName "run-httpd".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.416397 4848 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.416418 4848 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/47322ca8-f0b0-4d41-95e6-87a0dadb398f-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.422076 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-scripts" (OuterVolumeSpecName: "scripts") pod "47322ca8-f0b0-4d41-95e6-87a0dadb398f" (UID: "47322ca8-f0b0-4d41-95e6-87a0dadb398f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.422148 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47322ca8-f0b0-4d41-95e6-87a0dadb398f-kube-api-access-g8vwd" (OuterVolumeSpecName: "kube-api-access-g8vwd") pod "47322ca8-f0b0-4d41-95e6-87a0dadb398f" (UID: "47322ca8-f0b0-4d41-95e6-87a0dadb398f"). InnerVolumeSpecName "kube-api-access-g8vwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.454953 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "47322ca8-f0b0-4d41-95e6-87a0dadb398f" (UID: "47322ca8-f0b0-4d41-95e6-87a0dadb398f"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.481978 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "47322ca8-f0b0-4d41-95e6-87a0dadb398f" (UID: "47322ca8-f0b0-4d41-95e6-87a0dadb398f"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.514856 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "47322ca8-f0b0-4d41-95e6-87a0dadb398f" (UID: "47322ca8-f0b0-4d41-95e6-87a0dadb398f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.518257 4848 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.518290 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.518303 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8vwd\" (UniqueName: \"kubernetes.io/projected/47322ca8-f0b0-4d41-95e6-87a0dadb398f-kube-api-access-g8vwd\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.518317 4848 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.518329 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.542952 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-config-data" (OuterVolumeSpecName: "config-data") pod "47322ca8-f0b0-4d41-95e6-87a0dadb398f" (UID: "47322ca8-f0b0-4d41-95e6-87a0dadb398f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:23 crc kubenswrapper[4848]: I0217 09:24:23.620566 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47322ca8-f0b0-4d41-95e6-87a0dadb398f-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.021322 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"47322ca8-f0b0-4d41-95e6-87a0dadb398f","Type":"ContainerDied","Data":"eb3bbe13ff28bd063a333423906017f3f73242fa288c12a082c05e4c6f4e4b6e"} Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.021388 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.021407 4848 scope.go:117] "RemoveContainer" containerID="2bd4f43ebe118f33d899658bfe42d19da1a070ab8a7b54a2409ca7bd35b6c9e4" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.069770 4848 scope.go:117] "RemoveContainer" containerID="7bcb06962eb299e81b2b72d6fa76ff704752b056d7354b11a38d158c4c275085" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.097010 4848 scope.go:117] "RemoveContainer" containerID="b02b0297b2b09f1a55e30656fbe3b9bbaf64381349bf32db409cec54b3941a01" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.100069 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.117924 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.142492 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:24:24 crc kubenswrapper[4848]: E0217 09:24:24.148263 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="ceilometer-central-agent" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.148344 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="ceilometer-central-agent" Feb 17 09:24:24 crc kubenswrapper[4848]: E0217 09:24:24.148368 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="sg-core" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.148418 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="sg-core" Feb 17 09:24:24 crc kubenswrapper[4848]: E0217 09:24:24.148452 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="ceilometer-notification-agent" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.148265 4848 scope.go:117] "RemoveContainer" containerID="2e03f64fd10a82ed995e1e37db36ad0c9a0e0f0d04660af4b70f5de082d24a60" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.148669 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="ceilometer-notification-agent" Feb 17 09:24:24 crc kubenswrapper[4848]: E0217 09:24:24.148690 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="proxy-httpd" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.148699 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="proxy-httpd" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.149434 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="sg-core" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.149474 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="ceilometer-notification-agent" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.149487 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="proxy-httpd" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.149505 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" containerName="ceilometer-central-agent" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.152152 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.152280 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.155889 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.156097 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.156246 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.336073 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.336116 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.336145 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-config-data\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.336174 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65294004-8016-43fa-8017-90cb36bb8dcb-log-httpd\") pod \"ceilometer-0\" (UID: 
\"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.336400 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.336472 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4qvd\" (UniqueName: \"kubernetes.io/projected/65294004-8016-43fa-8017-90cb36bb8dcb-kube-api-access-x4qvd\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.336684 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65294004-8016-43fa-8017-90cb36bb8dcb-run-httpd\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.336738 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-scripts\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.437521 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4qvd\" (UniqueName: \"kubernetes.io/projected/65294004-8016-43fa-8017-90cb36bb8dcb-kube-api-access-x4qvd\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 
09:24:24.437593 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65294004-8016-43fa-8017-90cb36bb8dcb-run-httpd\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.437618 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-scripts\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.437679 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.437704 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.437730 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-config-data\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.437752 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65294004-8016-43fa-8017-90cb36bb8dcb-log-httpd\") pod \"ceilometer-0\" (UID: 
\"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.437796 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.441039 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65294004-8016-43fa-8017-90cb36bb8dcb-log-httpd\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.442875 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/65294004-8016-43fa-8017-90cb36bb8dcb-run-httpd\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.443252 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.443348 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.443616 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-scripts\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.444113 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.446700 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65294004-8016-43fa-8017-90cb36bb8dcb-config-data\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.462274 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4qvd\" (UniqueName: \"kubernetes.io/projected/65294004-8016-43fa-8017-90cb36bb8dcb-kube-api-access-x4qvd\") pod \"ceilometer-0\" (UID: \"65294004-8016-43fa-8017-90cb36bb8dcb\") " pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.561520 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.567507 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.742869 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mn7f\" (UniqueName: \"kubernetes.io/projected/d10bd453-8470-4867-b631-e5beac75fd90-kube-api-access-7mn7f\") pod \"d10bd453-8470-4867-b631-e5beac75fd90\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.742956 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-config-data\") pod \"d10bd453-8470-4867-b631-e5beac75fd90\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.743006 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-combined-ca-bundle\") pod \"d10bd453-8470-4867-b631-e5beac75fd90\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.743099 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d10bd453-8470-4867-b631-e5beac75fd90-logs\") pod \"d10bd453-8470-4867-b631-e5beac75fd90\" (UID: \"d10bd453-8470-4867-b631-e5beac75fd90\") " Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.744195 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d10bd453-8470-4867-b631-e5beac75fd90-logs" (OuterVolumeSpecName: "logs") pod "d10bd453-8470-4867-b631-e5beac75fd90" (UID: "d10bd453-8470-4867-b631-e5beac75fd90"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.749903 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d10bd453-8470-4867-b631-e5beac75fd90-kube-api-access-7mn7f" (OuterVolumeSpecName: "kube-api-access-7mn7f") pod "d10bd453-8470-4867-b631-e5beac75fd90" (UID: "d10bd453-8470-4867-b631-e5beac75fd90"). InnerVolumeSpecName "kube-api-access-7mn7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.768989 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-config-data" (OuterVolumeSpecName: "config-data") pod "d10bd453-8470-4867-b631-e5beac75fd90" (UID: "d10bd453-8470-4867-b631-e5beac75fd90"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.779645 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d10bd453-8470-4867-b631-e5beac75fd90" (UID: "d10bd453-8470-4867-b631-e5beac75fd90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.845017 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mn7f\" (UniqueName: \"kubernetes.io/projected/d10bd453-8470-4867-b631-e5beac75fd90-kube-api-access-7mn7f\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.845056 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.845066 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d10bd453-8470-4867-b631-e5beac75fd90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:24 crc kubenswrapper[4848]: I0217 09:24:24.845075 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d10bd453-8470-4867-b631-e5beac75fd90-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.030669 4848 generic.go:334] "Generic (PLEG): container finished" podID="d10bd453-8470-4867-b631-e5beac75fd90" containerID="8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de" exitCode=0 Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.030756 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.030789 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d10bd453-8470-4867-b631-e5beac75fd90","Type":"ContainerDied","Data":"8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de"} Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.030816 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d10bd453-8470-4867-b631-e5beac75fd90","Type":"ContainerDied","Data":"75db05ae649c9d99341f2916d022a383bba0636f13468dc0157d651c51f66a71"} Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.030832 4848 scope.go:117] "RemoveContainer" containerID="8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.069975 4848 scope.go:117] "RemoveContainer" containerID="203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.076522 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.091909 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.099659 4848 scope.go:117] "RemoveContainer" containerID="8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de" Feb 17 09:24:25 crc kubenswrapper[4848]: E0217 09:24:25.109164 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de\": container with ID starting with 8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de not found: ID does not exist" containerID="8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.109242 
4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de"} err="failed to get container status \"8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de\": rpc error: code = NotFound desc = could not find container \"8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de\": container with ID starting with 8eb2f53ca3bb2e2b5292e00d1fc1e7bbf9de9dc1cd68833823d078fe3d71f9de not found: ID does not exist" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.109275 4848 scope.go:117] "RemoveContainer" containerID="203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1" Feb 17 09:24:25 crc kubenswrapper[4848]: E0217 09:24:25.110521 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1\": container with ID starting with 203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1 not found: ID does not exist" containerID="203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.110545 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1"} err="failed to get container status \"203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1\": rpc error: code = NotFound desc = could not find container \"203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1\": container with ID starting with 203c65eab09ab2b0f28075f95603cca6017bac03eeaca2e9b511eba19f4f07f1 not found: ID does not exist" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.113831 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.136729 4848 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:25 crc kubenswrapper[4848]: E0217 09:24:25.137290 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10bd453-8470-4867-b631-e5beac75fd90" containerName="nova-api-log" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.151884 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10bd453-8470-4867-b631-e5beac75fd90" containerName="nova-api-log" Feb 17 09:24:25 crc kubenswrapper[4848]: E0217 09:24:25.152169 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d10bd453-8470-4867-b631-e5beac75fd90" containerName="nova-api-api" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.152226 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d10bd453-8470-4867-b631-e5beac75fd90" containerName="nova-api-api" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.152538 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10bd453-8470-4867-b631-e5beac75fd90" containerName="nova-api-api" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.154248 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="d10bd453-8470-4867-b631-e5beac75fd90" containerName="nova-api-log" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.156719 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.157781 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.163671 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.164275 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.169146 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.353597 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-public-tls-certs\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.353708 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n2b2\" (UniqueName: \"kubernetes.io/projected/2c01330f-7266-4e3d-ad9b-0ce7966b5676-kube-api-access-5n2b2\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.353777 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.353852 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c01330f-7266-4e3d-ad9b-0ce7966b5676-logs\") pod \"nova-api-0\" (UID: 
\"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.353881 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-config-data\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.353911 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.354099 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.371224 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.393632 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47322ca8-f0b0-4d41-95e6-87a0dadb398f" path="/var/lib/kubelet/pods/47322ca8-f0b0-4d41-95e6-87a0dadb398f/volumes" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.394708 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d10bd453-8470-4867-b631-e5beac75fd90" path="/var/lib/kubelet/pods/d10bd453-8470-4867-b631-e5beac75fd90/volumes" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.455586 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n2b2\" (UniqueName: \"kubernetes.io/projected/2c01330f-7266-4e3d-ad9b-0ce7966b5676-kube-api-access-5n2b2\") pod \"nova-api-0\" (UID: 
\"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.455675 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.455781 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c01330f-7266-4e3d-ad9b-0ce7966b5676-logs\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.455826 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-config-data\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.455852 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.455906 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-public-tls-certs\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.456385 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2c01330f-7266-4e3d-ad9b-0ce7966b5676-logs\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.461766 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.462283 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-public-tls-certs\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.463799 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-config-data\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.469296 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.477264 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n2b2\" (UniqueName: \"kubernetes.io/projected/2c01330f-7266-4e3d-ad9b-0ce7966b5676-kube-api-access-5n2b2\") pod \"nova-api-0\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.492157 4848 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 09:24:25 crc kubenswrapper[4848]: W0217 09:24:25.984537 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c01330f_7266_4e3d_ad9b_0ce7966b5676.slice/crio-71038928e48a2d743d977496f6c5928046e33a01ed85fb9ffb50667a372c001c WatchSource:0}: Error finding container 71038928e48a2d743d977496f6c5928046e33a01ed85fb9ffb50667a372c001c: Status 404 returned error can't find the container with id 71038928e48a2d743d977496f6c5928046e33a01ed85fb9ffb50667a372c001c Feb 17 09:24:25 crc kubenswrapper[4848]: I0217 09:24:25.993141 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.051946 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c01330f-7266-4e3d-ad9b-0ce7966b5676","Type":"ContainerStarted","Data":"71038928e48a2d743d977496f6c5928046e33a01ed85fb9ffb50667a372c001c"} Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.054456 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65294004-8016-43fa-8017-90cb36bb8dcb","Type":"ContainerStarted","Data":"aaa1b4d3aaeca25d297ef17cd212fc849b1cad118d1064aa15a93435d85c2b20"} Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.054506 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65294004-8016-43fa-8017-90cb36bb8dcb","Type":"ContainerStarted","Data":"bc75931c5781a49964a68304c5f431c74e32c1131e9c7b0a20d4524d491acb26"} Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.071613 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.254206 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-fkfjb"] Feb 
17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.256859 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.259442 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.259590 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.264489 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fkfjb"] Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.375586 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-config-data\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.375683 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.375805 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m5xk\" (UniqueName: \"kubernetes.io/projected/a25dbc2d-c651-434c-b30b-0ee52c27d295-kube-api-access-6m5xk\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.375835 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-scripts\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.477530 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-config-data\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.477817 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.478373 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m5xk\" (UniqueName: \"kubernetes.io/projected/a25dbc2d-c651-434c-b30b-0ee52c27d295-kube-api-access-6m5xk\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.478443 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-scripts\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.483143 4848 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.483164 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-scripts\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.483218 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-config-data\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.499534 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m5xk\" (UniqueName: \"kubernetes.io/projected/a25dbc2d-c651-434c-b30b-0ee52c27d295-kube-api-access-6m5xk\") pod \"nova-cell1-cell-mapping-fkfjb\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:26 crc kubenswrapper[4848]: I0217 09:24:26.607795 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:27 crc kubenswrapper[4848]: I0217 09:24:27.066949 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65294004-8016-43fa-8017-90cb36bb8dcb","Type":"ContainerStarted","Data":"80a42c93230d95b48d581886e2d5200b942ba64ca31ab74f3289b8ee722461f4"} Feb 17 09:24:27 crc kubenswrapper[4848]: I0217 09:24:27.067253 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65294004-8016-43fa-8017-90cb36bb8dcb","Type":"ContainerStarted","Data":"fc46e9e3d2c61f941fa2505c154933d65a21027cd0fbefee6a00f1ae7fff1124"} Feb 17 09:24:27 crc kubenswrapper[4848]: I0217 09:24:27.070694 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c01330f-7266-4e3d-ad9b-0ce7966b5676","Type":"ContainerStarted","Data":"72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53"} Feb 17 09:24:27 crc kubenswrapper[4848]: I0217 09:24:27.070723 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c01330f-7266-4e3d-ad9b-0ce7966b5676","Type":"ContainerStarted","Data":"e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19"} Feb 17 09:24:27 crc kubenswrapper[4848]: I0217 09:24:27.112462 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-fkfjb"] Feb 17 09:24:27 crc kubenswrapper[4848]: I0217 09:24:27.119103 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.119080658 podStartE2EDuration="2.119080658s" podCreationTimestamp="2026-02-17 09:24:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:24:27.108199024 +0000 UTC m=+1144.651454680" watchObservedRunningTime="2026-02-17 09:24:27.119080658 +0000 UTC m=+1144.662336314" Feb 17 09:24:28 crc 
kubenswrapper[4848]: I0217 09:24:28.079711 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fkfjb" event={"ID":"a25dbc2d-c651-434c-b30b-0ee52c27d295","Type":"ContainerStarted","Data":"c44e7dddfdf0890d05394d3acecbc3114bdcde1a72f2117b6e7be9ca20880852"} Feb 17 09:24:28 crc kubenswrapper[4848]: I0217 09:24:28.080104 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fkfjb" event={"ID":"a25dbc2d-c651-434c-b30b-0ee52c27d295","Type":"ContainerStarted","Data":"6a1d5c3cf444896d51411b346a22ea09780716272e4cc7cbe509d3f1983df089"} Feb 17 09:24:28 crc kubenswrapper[4848]: I0217 09:24:28.103137 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-fkfjb" podStartSLOduration=2.103111686 podStartE2EDuration="2.103111686s" podCreationTimestamp="2026-02-17 09:24:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:24:28.097516515 +0000 UTC m=+1145.640772161" watchObservedRunningTime="2026-02-17 09:24:28.103111686 +0000 UTC m=+1145.646367332" Feb 17 09:24:28 crc kubenswrapper[4848]: I0217 09:24:28.522562 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf" Feb 17 09:24:28 crc kubenswrapper[4848]: I0217 09:24:28.579163 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-b64x5"] Feb 17 09:24:28 crc kubenswrapper[4848]: I0217 09:24:28.579415 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" podUID="fe686229-24bb-4b7c-aab8-64f78e05d1f1" containerName="dnsmasq-dns" containerID="cri-o://602cd7a4b20ac413d3192147119f005fe6c9216e11594610180690750bd6da1b" gracePeriod=10 Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.091500 4848 generic.go:334] "Generic (PLEG): container 
finished" podID="fe686229-24bb-4b7c-aab8-64f78e05d1f1" containerID="602cd7a4b20ac413d3192147119f005fe6c9216e11594610180690750bd6da1b" exitCode=0 Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.091568 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" event={"ID":"fe686229-24bb-4b7c-aab8-64f78e05d1f1","Type":"ContainerDied","Data":"602cd7a4b20ac413d3192147119f005fe6c9216e11594610180690750bd6da1b"} Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.091820 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" event={"ID":"fe686229-24bb-4b7c-aab8-64f78e05d1f1","Type":"ContainerDied","Data":"514c2df53dfeb0334871b82f56dc63f6efb7abb92afaeb314b0d6d5072614d38"} Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.091837 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="514c2df53dfeb0334871b82f56dc63f6efb7abb92afaeb314b0d6d5072614d38" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.098375 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"65294004-8016-43fa-8017-90cb36bb8dcb","Type":"ContainerStarted","Data":"9fbd2b5b8375c75627b6bef00702bb3aabe34f251fd1a2c2087bff4628d06f51"} Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.098435 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.103410 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.122881 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.255769867 podStartE2EDuration="5.12285807s" podCreationTimestamp="2026-02-17 09:24:24 +0000 UTC" firstStartedPulling="2026-02-17 09:24:25.109162348 +0000 UTC m=+1142.652417994" lastFinishedPulling="2026-02-17 09:24:27.976250551 +0000 UTC m=+1145.519506197" observedRunningTime="2026-02-17 09:24:29.115167462 +0000 UTC m=+1146.658423118" watchObservedRunningTime="2026-02-17 09:24:29.12285807 +0000 UTC m=+1146.666113726" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.234510 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-svc\") pod \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.234590 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jnq2p\" (UniqueName: \"kubernetes.io/projected/fe686229-24bb-4b7c-aab8-64f78e05d1f1-kube-api-access-jnq2p\") pod \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.234657 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-sb\") pod \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.234678 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-nb\") 
pod \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.234747 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-config\") pod \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.234845 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-swift-storage-0\") pod \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\" (UID: \"fe686229-24bb-4b7c-aab8-64f78e05d1f1\") " Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.256984 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe686229-24bb-4b7c-aab8-64f78e05d1f1-kube-api-access-jnq2p" (OuterVolumeSpecName: "kube-api-access-jnq2p") pod "fe686229-24bb-4b7c-aab8-64f78e05d1f1" (UID: "fe686229-24bb-4b7c-aab8-64f78e05d1f1"). InnerVolumeSpecName "kube-api-access-jnq2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.285980 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fe686229-24bb-4b7c-aab8-64f78e05d1f1" (UID: "fe686229-24bb-4b7c-aab8-64f78e05d1f1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.293936 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fe686229-24bb-4b7c-aab8-64f78e05d1f1" (UID: "fe686229-24bb-4b7c-aab8-64f78e05d1f1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.306248 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-config" (OuterVolumeSpecName: "config") pod "fe686229-24bb-4b7c-aab8-64f78e05d1f1" (UID: "fe686229-24bb-4b7c-aab8-64f78e05d1f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.306894 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fe686229-24bb-4b7c-aab8-64f78e05d1f1" (UID: "fe686229-24bb-4b7c-aab8-64f78e05d1f1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.309947 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fe686229-24bb-4b7c-aab8-64f78e05d1f1" (UID: "fe686229-24bb-4b7c-aab8-64f78e05d1f1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.336805 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.336842 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jnq2p\" (UniqueName: \"kubernetes.io/projected/fe686229-24bb-4b7c-aab8-64f78e05d1f1-kube-api-access-jnq2p\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.336858 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.336873 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.336886 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:29 crc kubenswrapper[4848]: I0217 09:24:29.336898 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fe686229-24bb-4b7c-aab8-64f78e05d1f1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:30 crc kubenswrapper[4848]: I0217 09:24:30.108445 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bc699f5c5-b64x5" Feb 17 09:24:30 crc kubenswrapper[4848]: I0217 09:24:30.141379 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-b64x5"] Feb 17 09:24:30 crc kubenswrapper[4848]: I0217 09:24:30.157282 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bc699f5c5-b64x5"] Feb 17 09:24:31 crc kubenswrapper[4848]: I0217 09:24:31.395557 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe686229-24bb-4b7c-aab8-64f78e05d1f1" path="/var/lib/kubelet/pods/fe686229-24bb-4b7c-aab8-64f78e05d1f1/volumes" Feb 17 09:24:31 crc kubenswrapper[4848]: E0217 09:24:31.912676 4848 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda25dbc2d_c651_434c_b30b_0ee52c27d295.slice/crio-c44e7dddfdf0890d05394d3acecbc3114bdcde1a72f2117b6e7be9ca20880852.scope\": RecentStats: unable to find data in memory cache]" Feb 17 09:24:32 crc kubenswrapper[4848]: I0217 09:24:32.131897 4848 generic.go:334] "Generic (PLEG): container finished" podID="a25dbc2d-c651-434c-b30b-0ee52c27d295" containerID="c44e7dddfdf0890d05394d3acecbc3114bdcde1a72f2117b6e7be9ca20880852" exitCode=0 Feb 17 09:24:32 crc kubenswrapper[4848]: I0217 09:24:32.132216 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fkfjb" event={"ID":"a25dbc2d-c651-434c-b30b-0ee52c27d295","Type":"ContainerDied","Data":"c44e7dddfdf0890d05394d3acecbc3114bdcde1a72f2117b6e7be9ca20880852"} Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.617064 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.723491 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-combined-ca-bundle\") pod \"a25dbc2d-c651-434c-b30b-0ee52c27d295\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.723627 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-config-data\") pod \"a25dbc2d-c651-434c-b30b-0ee52c27d295\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.723674 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6m5xk\" (UniqueName: \"kubernetes.io/projected/a25dbc2d-c651-434c-b30b-0ee52c27d295-kube-api-access-6m5xk\") pod \"a25dbc2d-c651-434c-b30b-0ee52c27d295\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.723780 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-scripts\") pod \"a25dbc2d-c651-434c-b30b-0ee52c27d295\" (UID: \"a25dbc2d-c651-434c-b30b-0ee52c27d295\") " Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.740632 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25dbc2d-c651-434c-b30b-0ee52c27d295-kube-api-access-6m5xk" (OuterVolumeSpecName: "kube-api-access-6m5xk") pod "a25dbc2d-c651-434c-b30b-0ee52c27d295" (UID: "a25dbc2d-c651-434c-b30b-0ee52c27d295"). InnerVolumeSpecName "kube-api-access-6m5xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.741158 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-scripts" (OuterVolumeSpecName: "scripts") pod "a25dbc2d-c651-434c-b30b-0ee52c27d295" (UID: "a25dbc2d-c651-434c-b30b-0ee52c27d295"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.751423 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-config-data" (OuterVolumeSpecName: "config-data") pod "a25dbc2d-c651-434c-b30b-0ee52c27d295" (UID: "a25dbc2d-c651-434c-b30b-0ee52c27d295"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.752715 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a25dbc2d-c651-434c-b30b-0ee52c27d295" (UID: "a25dbc2d-c651-434c-b30b-0ee52c27d295"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.826561 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.826608 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6m5xk\" (UniqueName: \"kubernetes.io/projected/a25dbc2d-c651-434c-b30b-0ee52c27d295-kube-api-access-6m5xk\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.826642 4848 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-scripts\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:33 crc kubenswrapper[4848]: I0217 09:24:33.826654 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a25dbc2d-c651-434c-b30b-0ee52c27d295-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.154266 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-fkfjb" event={"ID":"a25dbc2d-c651-434c-b30b-0ee52c27d295","Type":"ContainerDied","Data":"6a1d5c3cf444896d51411b346a22ea09780716272e4cc7cbe509d3f1983df089"} Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.154573 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a1d5c3cf444896d51411b346a22ea09780716272e4cc7cbe509d3f1983df089" Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.154332 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-fkfjb" Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.353995 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.354213 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" containerName="nova-api-log" containerID="cri-o://e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19" gracePeriod=30 Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.354367 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" containerName="nova-api-api" containerID="cri-o://72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53" gracePeriod=30 Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.370811 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.371071 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2f1b1861-5e43-493d-a3de-9686bac8fb14" containerName="nova-scheduler-scheduler" containerID="cri-o://364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb" gracePeriod=30 Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.382113 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.382823 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-log" containerID="cri-o://35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6" gracePeriod=30 Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.383130 4848 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-metadata" containerID="cri-o://170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4" gracePeriod=30 Feb 17 09:24:34 crc kubenswrapper[4848]: I0217 09:24:34.881790 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.054801 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5n2b2\" (UniqueName: \"kubernetes.io/projected/2c01330f-7266-4e3d-ad9b-0ce7966b5676-kube-api-access-5n2b2\") pod \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.055253 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-internal-tls-certs\") pod \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.056845 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c01330f-7266-4e3d-ad9b-0ce7966b5676-logs\") pod \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.057128 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-config-data\") pod \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.057185 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-combined-ca-bundle\") pod \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.057279 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-public-tls-certs\") pod \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\" (UID: \"2c01330f-7266-4e3d-ad9b-0ce7966b5676\") " Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.057451 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c01330f-7266-4e3d-ad9b-0ce7966b5676-logs" (OuterVolumeSpecName: "logs") pod "2c01330f-7266-4e3d-ad9b-0ce7966b5676" (UID: "2c01330f-7266-4e3d-ad9b-0ce7966b5676"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.057984 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c01330f-7266-4e3d-ad9b-0ce7966b5676-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.065790 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c01330f-7266-4e3d-ad9b-0ce7966b5676-kube-api-access-5n2b2" (OuterVolumeSpecName: "kube-api-access-5n2b2") pod "2c01330f-7266-4e3d-ad9b-0ce7966b5676" (UID: "2c01330f-7266-4e3d-ad9b-0ce7966b5676"). InnerVolumeSpecName "kube-api-access-5n2b2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.104667 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-config-data" (OuterVolumeSpecName: "config-data") pod "2c01330f-7266-4e3d-ad9b-0ce7966b5676" (UID: "2c01330f-7266-4e3d-ad9b-0ce7966b5676"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.109121 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c01330f-7266-4e3d-ad9b-0ce7966b5676" (UID: "2c01330f-7266-4e3d-ad9b-0ce7966b5676"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.111662 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2c01330f-7266-4e3d-ad9b-0ce7966b5676" (UID: "2c01330f-7266-4e3d-ad9b-0ce7966b5676"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.121123 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2c01330f-7266-4e3d-ad9b-0ce7966b5676" (UID: "2c01330f-7266-4e3d-ad9b-0ce7966b5676"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.159253 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5n2b2\" (UniqueName: \"kubernetes.io/projected/2c01330f-7266-4e3d-ad9b-0ce7966b5676-kube-api-access-5n2b2\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.159292 4848 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.159301 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.159311 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.159319 4848 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2c01330f-7266-4e3d-ad9b-0ce7966b5676-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.165313 4848 generic.go:334] "Generic (PLEG): container finished" podID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" containerID="72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53" exitCode=0 Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.165345 4848 generic.go:334] "Generic (PLEG): container finished" podID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" containerID="e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19" exitCode=143 Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.165379 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c01330f-7266-4e3d-ad9b-0ce7966b5676","Type":"ContainerDied","Data":"72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53"} Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.165404 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c01330f-7266-4e3d-ad9b-0ce7966b5676","Type":"ContainerDied","Data":"e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19"} Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.165413 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2c01330f-7266-4e3d-ad9b-0ce7966b5676","Type":"ContainerDied","Data":"71038928e48a2d743d977496f6c5928046e33a01ed85fb9ffb50667a372c001c"} Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.165430 4848 scope.go:117] "RemoveContainer" containerID="72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.165540 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.173130 4848 generic.go:334] "Generic (PLEG): container finished" podID="71d697a6-85b3-4801-8521-c72a378cbed0" containerID="35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6" exitCode=143 Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.173193 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71d697a6-85b3-4801-8521-c72a378cbed0","Type":"ContainerDied","Data":"35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6"} Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.200514 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.201977 4848 scope.go:117] "RemoveContainer" containerID="e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.210106 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.229245 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:35 crc kubenswrapper[4848]: E0217 09:24:35.229610 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe686229-24bb-4b7c-aab8-64f78e05d1f1" containerName="init" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.229629 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe686229-24bb-4b7c-aab8-64f78e05d1f1" containerName="init" Feb 17 09:24:35 crc kubenswrapper[4848]: E0217 09:24:35.229645 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" containerName="nova-api-log" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.229652 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" containerName="nova-api-log" Feb 17 
09:24:35 crc kubenswrapper[4848]: E0217 09:24:35.229678 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe686229-24bb-4b7c-aab8-64f78e05d1f1" containerName="dnsmasq-dns" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.229684 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe686229-24bb-4b7c-aab8-64f78e05d1f1" containerName="dnsmasq-dns" Feb 17 09:24:35 crc kubenswrapper[4848]: E0217 09:24:35.229697 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25dbc2d-c651-434c-b30b-0ee52c27d295" containerName="nova-manage" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.229702 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25dbc2d-c651-434c-b30b-0ee52c27d295" containerName="nova-manage" Feb 17 09:24:35 crc kubenswrapper[4848]: E0217 09:24:35.229716 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" containerName="nova-api-api" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.229722 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" containerName="nova-api-api" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.231069 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" containerName="nova-api-api" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.231098 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe686229-24bb-4b7c-aab8-64f78e05d1f1" containerName="dnsmasq-dns" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.231109 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" containerName="nova-api-log" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.231122 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25dbc2d-c651-434c-b30b-0ee52c27d295" containerName="nova-manage" Feb 17 09:24:35 crc 
kubenswrapper[4848]: I0217 09:24:35.232128 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.236587 4848 scope.go:117] "RemoveContainer" containerID="72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.237027 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.237074 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.237023 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Feb 17 09:24:35 crc kubenswrapper[4848]: E0217 09:24:35.237411 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53\": container with ID starting with 72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53 not found: ID does not exist" containerID="72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.237463 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53"} err="failed to get container status \"72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53\": rpc error: code = NotFound desc = could not find container \"72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53\": container with ID starting with 72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53 not found: ID does not exist" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.237491 4848 scope.go:117] "RemoveContainer" 
containerID="e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19" Feb 17 09:24:35 crc kubenswrapper[4848]: E0217 09:24:35.237742 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19\": container with ID starting with e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19 not found: ID does not exist" containerID="e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.238444 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19"} err="failed to get container status \"e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19\": rpc error: code = NotFound desc = could not find container \"e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19\": container with ID starting with e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19 not found: ID does not exist" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.238467 4848 scope.go:117] "RemoveContainer" containerID="72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.238697 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53"} err="failed to get container status \"72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53\": rpc error: code = NotFound desc = could not find container \"72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53\": container with ID starting with 72a838f613d4d7f65347edf39ad7e02da7bd0271240b931ce972d4a276974d53 not found: ID does not exist" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.238741 4848 scope.go:117] 
"RemoveContainer" containerID="e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.241015 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19"} err="failed to get container status \"e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19\": rpc error: code = NotFound desc = could not find container \"e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19\": container with ID starting with e51320227111e59b6ce0d6622fc2b59a719f7697ef7e53ab88f8ee5ca7636c19 not found: ID does not exist" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.251864 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.365315 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-config-data\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.365409 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-logs\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.365463 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcb59\" (UniqueName: \"kubernetes.io/projected/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-kube-api-access-tcb59\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.365528 
4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.365597 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.365635 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-public-tls-certs\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.394177 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c01330f-7266-4e3d-ad9b-0ce7966b5676" path="/var/lib/kubelet/pods/2c01330f-7266-4e3d-ad9b-0ce7966b5676/volumes" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.467134 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcb59\" (UniqueName: \"kubernetes.io/projected/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-kube-api-access-tcb59\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.467200 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.467251 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.467848 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-public-tls-certs\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.467917 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-config-data\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.467958 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-logs\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.468344 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-logs\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.470514 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-internal-tls-certs\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.470534 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.471495 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-config-data\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.471511 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-public-tls-certs\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.487527 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcb59\" (UniqueName: \"kubernetes.io/projected/3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7-kube-api-access-tcb59\") pod \"nova-api-0\" (UID: \"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7\") " pod="openstack/nova-api-0" Feb 17 09:24:35 crc kubenswrapper[4848]: I0217 09:24:35.549207 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 17 09:24:36 crc kubenswrapper[4848]: E0217 09:24:36.029063 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 09:24:36 crc kubenswrapper[4848]: E0217 09:24:36.031673 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 09:24:36 crc kubenswrapper[4848]: E0217 09:24:36.033847 4848 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 17 09:24:36 crc kubenswrapper[4848]: E0217 09:24:36.033888 4848 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2f1b1861-5e43-493d-a3de-9686bac8fb14" containerName="nova-scheduler-scheduler" Feb 17 09:24:36 crc kubenswrapper[4848]: I0217 09:24:36.159993 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 17 09:24:36 crc kubenswrapper[4848]: I0217 09:24:36.184280 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7","Type":"ContainerStarted","Data":"810ad64ebe4d9110d21034a26e0cd0ce560f3dc6dcad3d18d8a4dd88c25eaed6"} Feb 17 09:24:37 crc kubenswrapper[4848]: I0217 09:24:37.204275 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7","Type":"ContainerStarted","Data":"e2e81cbf306620cb59fba7e365c5e82259637da183e1d6231e5a321d20d06f3c"} Feb 17 09:24:37 crc kubenswrapper[4848]: I0217 09:24:37.204643 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7","Type":"ContainerStarted","Data":"5d8502ea4c94c343dcf92a36c0cdf3613355515f0d1f9cd85d90e1019ebda392"} Feb 17 09:24:37 crc kubenswrapper[4848]: I0217 09:24:37.240225 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.240206183 podStartE2EDuration="2.240206183s" podCreationTimestamp="2026-02-17 09:24:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:24:37.238346743 +0000 UTC m=+1154.781602379" watchObservedRunningTime="2026-02-17 09:24:37.240206183 +0000 UTC m=+1154.783461829" Feb 17 09:24:37 crc kubenswrapper[4848]: I0217 09:24:37.523170 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 10.217.0.2:42248->10.217.0.197:8775: read: connection reset by peer" Feb 17 09:24:37 crc kubenswrapper[4848]: I0217 09:24:37.523183 4848 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.197:8775/\": read tcp 
10.217.0.2:42242->10.217.0.197:8775: read: connection reset by peer" Feb 17 09:24:37 crc kubenswrapper[4848]: I0217 09:24:37.989831 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.154319 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxrkg\" (UniqueName: \"kubernetes.io/projected/71d697a6-85b3-4801-8521-c72a378cbed0-kube-api-access-zxrkg\") pod \"71d697a6-85b3-4801-8521-c72a378cbed0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.154696 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-config-data\") pod \"71d697a6-85b3-4801-8521-c72a378cbed0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.154737 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-combined-ca-bundle\") pod \"71d697a6-85b3-4801-8521-c72a378cbed0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.154879 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71d697a6-85b3-4801-8521-c72a378cbed0-logs\") pod \"71d697a6-85b3-4801-8521-c72a378cbed0\" (UID: \"71d697a6-85b3-4801-8521-c72a378cbed0\") " Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.154922 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-nova-metadata-tls-certs\") pod \"71d697a6-85b3-4801-8521-c72a378cbed0\" (UID: 
\"71d697a6-85b3-4801-8521-c72a378cbed0\") " Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.156074 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71d697a6-85b3-4801-8521-c72a378cbed0-logs" (OuterVolumeSpecName: "logs") pod "71d697a6-85b3-4801-8521-c72a378cbed0" (UID: "71d697a6-85b3-4801-8521-c72a378cbed0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.160010 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71d697a6-85b3-4801-8521-c72a378cbed0-kube-api-access-zxrkg" (OuterVolumeSpecName: "kube-api-access-zxrkg") pod "71d697a6-85b3-4801-8521-c72a378cbed0" (UID: "71d697a6-85b3-4801-8521-c72a378cbed0"). InnerVolumeSpecName "kube-api-access-zxrkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.189560 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71d697a6-85b3-4801-8521-c72a378cbed0" (UID: "71d697a6-85b3-4801-8521-c72a378cbed0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.211092 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-config-data" (OuterVolumeSpecName: "config-data") pod "71d697a6-85b3-4801-8521-c72a378cbed0" (UID: "71d697a6-85b3-4801-8521-c72a378cbed0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.216595 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "71d697a6-85b3-4801-8521-c72a378cbed0" (UID: "71d697a6-85b3-4801-8521-c72a378cbed0"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.216957 4848 generic.go:334] "Generic (PLEG): container finished" podID="71d697a6-85b3-4801-8521-c72a378cbed0" containerID="170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4" exitCode=0 Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.217032 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.217063 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71d697a6-85b3-4801-8521-c72a378cbed0","Type":"ContainerDied","Data":"170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4"} Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.217101 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"71d697a6-85b3-4801-8521-c72a378cbed0","Type":"ContainerDied","Data":"60104c804a9be6f4ad92f4c22895d851f04632814a2d995069bbb01d456add3c"} Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.217123 4848 scope.go:117] "RemoveContainer" containerID="170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.257331 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxrkg\" (UniqueName: \"kubernetes.io/projected/71d697a6-85b3-4801-8521-c72a378cbed0-kube-api-access-zxrkg\") on node \"crc\" 
DevicePath \"\"" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.257379 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.257399 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.257411 4848 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/71d697a6-85b3-4801-8521-c72a378cbed0-logs\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.257424 4848 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/71d697a6-85b3-4801-8521-c72a378cbed0-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.303820 4848 scope.go:117] "RemoveContainer" containerID="35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.303964 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.317265 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.338155 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:24:38 crc kubenswrapper[4848]: E0217 09:24:38.338627 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-log" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.338648 4848 
state_mem.go:107] "Deleted CPUSet assignment" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-log" Feb 17 09:24:38 crc kubenswrapper[4848]: E0217 09:24:38.338668 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-metadata" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.338678 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-metadata" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.338927 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-log" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.338950 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" containerName="nova-metadata-metadata" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.340099 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.344219 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.344895 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.346014 4848 scope.go:117] "RemoveContainer" containerID="170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4" Feb 17 09:24:38 crc kubenswrapper[4848]: E0217 09:24:38.346397 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4\": container with ID starting with 170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4 not found: ID does not exist" containerID="170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.346429 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4"} err="failed to get container status \"170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4\": rpc error: code = NotFound desc = could not find container \"170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4\": container with ID starting with 170ec1f977a1bcdd85999328d83122ee1b715ebb6e69c1813f9e8199a7656dc4 not found: ID does not exist" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.346455 4848 scope.go:117] "RemoveContainer" containerID="35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6" Feb 17 09:24:38 crc kubenswrapper[4848]: E0217 09:24:38.346815 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6\": container with ID starting with 35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6 not found: ID does not exist" containerID="35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.346844 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6"} err="failed to get container status \"35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6\": rpc error: code = NotFound desc = could not find container \"35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6\": container with ID starting with 35e1bc5319f656b679e649c2f3db1a5c380f56716ba083cbe71268e5a8f337c6 not found: ID does not exist" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.358815 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ef5b18-4ce7-4850-b6c8-70e0727fc805-config-data\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.358897 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck974\" (UniqueName: \"kubernetes.io/projected/28ef5b18-4ce7-4850-b6c8-70e0727fc805-kube-api-access-ck974\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.358921 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ef5b18-4ce7-4850-b6c8-70e0727fc805-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.358965 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28ef5b18-4ce7-4850-b6c8-70e0727fc805-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.358993 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ef5b18-4ce7-4850-b6c8-70e0727fc805-logs\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.363772 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.460700 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck974\" (UniqueName: \"kubernetes.io/projected/28ef5b18-4ce7-4850-b6c8-70e0727fc805-kube-api-access-ck974\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.461079 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ef5b18-4ce7-4850-b6c8-70e0727fc805-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.461237 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/28ef5b18-4ce7-4850-b6c8-70e0727fc805-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.461346 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ef5b18-4ce7-4850-b6c8-70e0727fc805-logs\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.461619 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ef5b18-4ce7-4850-b6c8-70e0727fc805-config-data\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.462280 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28ef5b18-4ce7-4850-b6c8-70e0727fc805-logs\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.464557 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28ef5b18-4ce7-4850-b6c8-70e0727fc805-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.464953 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/28ef5b18-4ce7-4850-b6c8-70e0727fc805-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: 
I0217 09:24:38.465188 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28ef5b18-4ce7-4850-b6c8-70e0727fc805-config-data\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.489395 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck974\" (UniqueName: \"kubernetes.io/projected/28ef5b18-4ce7-4850-b6c8-70e0727fc805-kube-api-access-ck974\") pod \"nova-metadata-0\" (UID: \"28ef5b18-4ce7-4850-b6c8-70e0727fc805\") " pod="openstack/nova-metadata-0" Feb 17 09:24:38 crc kubenswrapper[4848]: I0217 09:24:38.668492 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 17 09:24:39 crc kubenswrapper[4848]: I0217 09:24:39.144578 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 17 09:24:39 crc kubenswrapper[4848]: I0217 09:24:39.228128 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28ef5b18-4ce7-4850-b6c8-70e0727fc805","Type":"ContainerStarted","Data":"248f6fca91ed3f08f1a9d872d5c0974bc03724241aa92c15e6d59ae5b88e9505"} Feb 17 09:24:39 crc kubenswrapper[4848]: I0217 09:24:39.397176 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71d697a6-85b3-4801-8521-c72a378cbed0" path="/var/lib/kubelet/pods/71d697a6-85b3-4801-8521-c72a378cbed0/volumes" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.023080 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.095683 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-config-data\") pod \"2f1b1861-5e43-493d-a3de-9686bac8fb14\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.095797 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-combined-ca-bundle\") pod \"2f1b1861-5e43-493d-a3de-9686bac8fb14\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.096008 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mb22\" (UniqueName: \"kubernetes.io/projected/2f1b1861-5e43-493d-a3de-9686bac8fb14-kube-api-access-8mb22\") pod \"2f1b1861-5e43-493d-a3de-9686bac8fb14\" (UID: \"2f1b1861-5e43-493d-a3de-9686bac8fb14\") " Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.102812 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f1b1861-5e43-493d-a3de-9686bac8fb14-kube-api-access-8mb22" (OuterVolumeSpecName: "kube-api-access-8mb22") pod "2f1b1861-5e43-493d-a3de-9686bac8fb14" (UID: "2f1b1861-5e43-493d-a3de-9686bac8fb14"). InnerVolumeSpecName "kube-api-access-8mb22". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.138369 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-config-data" (OuterVolumeSpecName: "config-data") pod "2f1b1861-5e43-493d-a3de-9686bac8fb14" (UID: "2f1b1861-5e43-493d-a3de-9686bac8fb14"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.139906 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2f1b1861-5e43-493d-a3de-9686bac8fb14" (UID: "2f1b1861-5e43-493d-a3de-9686bac8fb14"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.198493 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.198528 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2f1b1861-5e43-493d-a3de-9686bac8fb14-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.198542 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mb22\" (UniqueName: \"kubernetes.io/projected/2f1b1861-5e43-493d-a3de-9686bac8fb14-kube-api-access-8mb22\") on node \"crc\" DevicePath \"\"" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.247922 4848 generic.go:334] "Generic (PLEG): container finished" podID="2f1b1861-5e43-493d-a3de-9686bac8fb14" containerID="364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb" exitCode=0 Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.248018 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2f1b1861-5e43-493d-a3de-9686bac8fb14","Type":"ContainerDied","Data":"364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb"} Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.248033 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.248048 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2f1b1861-5e43-493d-a3de-9686bac8fb14","Type":"ContainerDied","Data":"69c6ded57a6924b8ddd4f92b763938d01d6e449840912c5cc8621e2a0e4aecab"} Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.248070 4848 scope.go:117] "RemoveContainer" containerID="364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.250376 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28ef5b18-4ce7-4850-b6c8-70e0727fc805","Type":"ContainerStarted","Data":"6a2805d769d12830f3047f4703e49705597c2f804b496a26f5f4faff88d1ed2e"} Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.250414 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"28ef5b18-4ce7-4850-b6c8-70e0727fc805","Type":"ContainerStarted","Data":"16a6e4017e6d9f0f9f1bb15ccd7819d40b3db9dd5e5518945566108b1fdfa438"} Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.282448 4848 scope.go:117] "RemoveContainer" containerID="364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb" Feb 17 09:24:40 crc kubenswrapper[4848]: E0217 09:24:40.283012 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb\": container with ID starting with 364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb not found: ID does not exist" containerID="364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.283051 4848 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb"} err="failed to get container status \"364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb\": rpc error: code = NotFound desc = could not find container \"364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb\": container with ID starting with 364bf0e2ffc5335bddc0ff70dade4dc242ace3db5f8935f044cf8e39e5b10cbb not found: ID does not exist" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.287153 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.287133002 podStartE2EDuration="2.287133002s" podCreationTimestamp="2026-02-17 09:24:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:24:40.270303197 +0000 UTC m=+1157.813558853" watchObservedRunningTime="2026-02-17 09:24:40.287133002 +0000 UTC m=+1157.830388658" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.312916 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.330129 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.339608 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:24:40 crc kubenswrapper[4848]: E0217 09:24:40.340013 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f1b1861-5e43-493d-a3de-9686bac8fb14" containerName="nova-scheduler-scheduler" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.340028 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f1b1861-5e43-493d-a3de-9686bac8fb14" containerName="nova-scheduler-scheduler" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.340211 4848 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="2f1b1861-5e43-493d-a3de-9686bac8fb14" containerName="nova-scheduler-scheduler" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.340842 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.342852 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.404112 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e2420-02b2-4269-adc9-573cd91cccd9-config-data\") pod \"nova-scheduler-0\" (UID: \"627e2420-02b2-4269-adc9-573cd91cccd9\") " pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.404179 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcc6q\" (UniqueName: \"kubernetes.io/projected/627e2420-02b2-4269-adc9-573cd91cccd9-kube-api-access-jcc6q\") pod \"nova-scheduler-0\" (UID: \"627e2420-02b2-4269-adc9-573cd91cccd9\") " pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.404327 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e2420-02b2-4269-adc9-573cd91cccd9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"627e2420-02b2-4269-adc9-573cd91cccd9\") " pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.421231 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.505660 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/627e2420-02b2-4269-adc9-573cd91cccd9-config-data\") pod \"nova-scheduler-0\" (UID: \"627e2420-02b2-4269-adc9-573cd91cccd9\") " pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.506052 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcc6q\" (UniqueName: \"kubernetes.io/projected/627e2420-02b2-4269-adc9-573cd91cccd9-kube-api-access-jcc6q\") pod \"nova-scheduler-0\" (UID: \"627e2420-02b2-4269-adc9-573cd91cccd9\") " pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.506185 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e2420-02b2-4269-adc9-573cd91cccd9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"627e2420-02b2-4269-adc9-573cd91cccd9\") " pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.510179 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/627e2420-02b2-4269-adc9-573cd91cccd9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"627e2420-02b2-4269-adc9-573cd91cccd9\") " pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.510648 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/627e2420-02b2-4269-adc9-573cd91cccd9-config-data\") pod \"nova-scheduler-0\" (UID: \"627e2420-02b2-4269-adc9-573cd91cccd9\") " pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.527581 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcc6q\" (UniqueName: \"kubernetes.io/projected/627e2420-02b2-4269-adc9-573cd91cccd9-kube-api-access-jcc6q\") pod \"nova-scheduler-0\" (UID: \"627e2420-02b2-4269-adc9-573cd91cccd9\") " 
pod="openstack/nova-scheduler-0" Feb 17 09:24:40 crc kubenswrapper[4848]: I0217 09:24:40.722890 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 17 09:24:41 crc kubenswrapper[4848]: I0217 09:24:41.193354 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 17 09:24:41 crc kubenswrapper[4848]: I0217 09:24:41.260927 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"627e2420-02b2-4269-adc9-573cd91cccd9","Type":"ContainerStarted","Data":"f5a200e58d342c618d48c94f3c63a23ed2b0616b39238a4fe87388effe3ead30"} Feb 17 09:24:41 crc kubenswrapper[4848]: I0217 09:24:41.398348 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f1b1861-5e43-493d-a3de-9686bac8fb14" path="/var/lib/kubelet/pods/2f1b1861-5e43-493d-a3de-9686bac8fb14/volumes" Feb 17 09:24:42 crc kubenswrapper[4848]: I0217 09:24:42.276022 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"627e2420-02b2-4269-adc9-573cd91cccd9","Type":"ContainerStarted","Data":"6b262ff3baf8739427bac3445d53472d1ee4375a24350118b850f36b381813aa"} Feb 17 09:24:42 crc kubenswrapper[4848]: I0217 09:24:42.312933 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.312907298 podStartE2EDuration="2.312907298s" podCreationTimestamp="2026-02-17 09:24:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:24:42.295816727 +0000 UTC m=+1159.839072383" watchObservedRunningTime="2026-02-17 09:24:42.312907298 +0000 UTC m=+1159.856162984" Feb 17 09:24:43 crc kubenswrapper[4848]: I0217 09:24:43.668961 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 09:24:43 crc kubenswrapper[4848]: I0217 09:24:43.669364 
4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 17 09:24:45 crc kubenswrapper[4848]: I0217 09:24:45.549746 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 09:24:45 crc kubenswrapper[4848]: I0217 09:24:45.550132 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 17 09:24:45 crc kubenswrapper[4848]: I0217 09:24:45.723257 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 17 09:24:46 crc kubenswrapper[4848]: I0217 09:24:46.566025 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 09:24:46 crc kubenswrapper[4848]: I0217 09:24:46.566064 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 09:24:48 crc kubenswrapper[4848]: I0217 09:24:48.675255 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 09:24:48 crc kubenswrapper[4848]: I0217 09:24:48.675613 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 17 09:24:49 crc kubenswrapper[4848]: I0217 09:24:49.689930 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="28ef5b18-4ce7-4850-b6c8-70e0727fc805" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled 
(Client.Timeout exceeded while awaiting headers)" Feb 17 09:24:49 crc kubenswrapper[4848]: I0217 09:24:49.689946 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="28ef5b18-4ce7-4850-b6c8-70e0727fc805" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.208:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Feb 17 09:24:50 crc kubenswrapper[4848]: I0217 09:24:50.723217 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 17 09:24:50 crc kubenswrapper[4848]: I0217 09:24:50.760998 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 17 09:24:51 crc kubenswrapper[4848]: I0217 09:24:51.420694 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 17 09:24:54 crc kubenswrapper[4848]: I0217 09:24:54.575524 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 17 09:24:55 crc kubenswrapper[4848]: I0217 09:24:55.559024 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 09:24:55 crc kubenswrapper[4848]: I0217 09:24:55.560223 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 09:24:55 crc kubenswrapper[4848]: I0217 09:24:55.560666 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 17 09:24:55 crc kubenswrapper[4848]: I0217 09:24:55.572306 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 09:24:56 crc kubenswrapper[4848]: I0217 09:24:56.459797 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 17 09:24:56 crc kubenswrapper[4848]: I0217 09:24:56.472177 4848 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 17 09:24:58 crc kubenswrapper[4848]: I0217 09:24:58.677721 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 09:24:58 crc kubenswrapper[4848]: I0217 09:24:58.679569 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 17 09:24:58 crc kubenswrapper[4848]: I0217 09:24:58.688478 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 09:24:59 crc kubenswrapper[4848]: I0217 09:24:59.496810 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 17 09:25:07 crc kubenswrapper[4848]: I0217 09:25:07.022792 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 09:25:07 crc kubenswrapper[4848]: I0217 09:25:07.817605 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 09:25:11 crc kubenswrapper[4848]: I0217 09:25:11.234793 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="db50eaa9-ca0a-4a83-98d8-fce82f849d91" containerName="rabbitmq" containerID="cri-o://89c4b7e4ebba0e5a926ad66e26e6c279acfd3e58d484e76f931ec2ced0b778c0" gracePeriod=604796 Feb 17 09:25:12 crc kubenswrapper[4848]: I0217 09:25:12.085387 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="bd7e9b9b-99f0-4720-b997-3f00996972e5" containerName="rabbitmq" containerID="cri-o://db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39" gracePeriod=604796 Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.675847 4848 generic.go:334] "Generic (PLEG): container finished" podID="db50eaa9-ca0a-4a83-98d8-fce82f849d91" 
containerID="89c4b7e4ebba0e5a926ad66e26e6c279acfd3e58d484e76f931ec2ced0b778c0" exitCode=0 Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.675919 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db50eaa9-ca0a-4a83-98d8-fce82f849d91","Type":"ContainerDied","Data":"89c4b7e4ebba0e5a926ad66e26e6c279acfd3e58d484e76f931ec2ced0b778c0"} Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.879833 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.948603 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhcn9\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-kube-api-access-xhcn9\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.948699 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.948929 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-tls\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.948969 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-confd\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc 
kubenswrapper[4848]: I0217 09:25:17.949059 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-plugins\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.949096 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-erlang-cookie\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.949134 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-server-conf\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.949233 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-config-data\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.949297 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db50eaa9-ca0a-4a83-98d8-fce82f849d91-erlang-cookie-secret\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.949347 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/db50eaa9-ca0a-4a83-98d8-fce82f849d91-pod-info\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.949398 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-plugins-conf\") pod \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\" (UID: \"db50eaa9-ca0a-4a83-98d8-fce82f849d91\") " Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.950473 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.950688 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.951632 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.954986 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "persistence") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.955163 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db50eaa9-ca0a-4a83-98d8-fce82f849d91-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.956426 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-kube-api-access-xhcn9" (OuterVolumeSpecName: "kube-api-access-xhcn9") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "kube-api-access-xhcn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.962998 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:25:17 crc kubenswrapper[4848]: I0217 09:25:17.987268 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/db50eaa9-ca0a-4a83-98d8-fce82f849d91-pod-info" (OuterVolumeSpecName: "pod-info") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.016528 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-config-data" (OuterVolumeSpecName: "config-data") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.052190 4848 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/db50eaa9-ca0a-4a83-98d8-fce82f849d91-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.052228 4848 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.052245 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhcn9\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-kube-api-access-xhcn9\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.052279 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Feb 17 09:25:18 crc 
kubenswrapper[4848]: I0217 09:25:18.052291 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.052302 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.052314 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.052326 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.052337 4848 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/db50eaa9-ca0a-4a83-98d8-fce82f849d91-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.061824 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-server-conf" (OuterVolumeSpecName: "server-conf") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.077964 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.102808 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "db50eaa9-ca0a-4a83-98d8-fce82f849d91" (UID: "db50eaa9-ca0a-4a83-98d8-fce82f849d91"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.153478 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/db50eaa9-ca0a-4a83-98d8-fce82f849d91-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.153696 4848 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/db50eaa9-ca0a-4a83-98d8-fce82f849d91-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.153754 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.600079 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662050 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-erlang-cookie\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662139 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59skc\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-kube-api-access-59skc\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662213 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-plugins\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662231 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-server-conf\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662249 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-plugins-conf\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662276 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-tls\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662325 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/bd7e9b9b-99f0-4720-b997-3f00996972e5-erlang-cookie-secret\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662374 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd7e9b9b-99f0-4720-b997-3f00996972e5-pod-info\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662398 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-config-data\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662416 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.662450 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-confd\") pod \"bd7e9b9b-99f0-4720-b997-3f00996972e5\" (UID: \"bd7e9b9b-99f0-4720-b997-3f00996972e5\") " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 
09:25:18.662721 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.663075 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.664845 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.665081 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.670134 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-kube-api-access-59skc" (OuterVolumeSpecName: "kube-api-access-59skc") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "kube-api-access-59skc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.670843 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.671955 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd7e9b9b-99f0-4720-b997-3f00996972e5-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.672930 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.689865 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/bd7e9b9b-99f0-4720-b997-3f00996972e5-pod-info" (OuterVolumeSpecName: "pod-info") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.694375 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-config-data" (OuterVolumeSpecName: "config-data") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.708211 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.708223 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"db50eaa9-ca0a-4a83-98d8-fce82f849d91","Type":"ContainerDied","Data":"3dfcbb0c9f17996275624b0f8bf1c1f2611bee778fca02953895dcb785168cee"} Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.708724 4848 scope.go:117] "RemoveContainer" containerID="89c4b7e4ebba0e5a926ad66e26e6c279acfd3e58d484e76f931ec2ced0b778c0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.719887 4848 generic.go:334] "Generic (PLEG): container finished" podID="bd7e9b9b-99f0-4720-b997-3f00996972e5" containerID="db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39" exitCode=0 Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.719931 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"bd7e9b9b-99f0-4720-b997-3f00996972e5","Type":"ContainerDied","Data":"db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39"} Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.719956 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"bd7e9b9b-99f0-4720-b997-3f00996972e5","Type":"ContainerDied","Data":"c5f0ab0653b4b772a4a2cbfa5453ba966069e6e9f42663253bb2f5af996474ac"} Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.719955 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.760560 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-server-conf" (OuterVolumeSpecName: "server-conf") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.765306 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.765340 4848 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-server-conf\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.765352 4848 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.765363 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.765373 4848 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/bd7e9b9b-99f0-4720-b997-3f00996972e5-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.765386 4848 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/bd7e9b9b-99f0-4720-b997-3f00996972e5-pod-info\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.765395 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bd7e9b9b-99f0-4720-b997-3f00996972e5-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.765434 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.765449 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59skc\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-kube-api-access-59skc\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.772083 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.772136 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.790853 4848 
operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.796875 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "bd7e9b9b-99f0-4720-b997-3f00996972e5" (UID: "bd7e9b9b-99f0-4720-b997-3f00996972e5"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.825339 4848 scope.go:117] "RemoveContainer" containerID="812daf3e743d62ecf6c5cde4e775fa48bf1f6c9b01408f073084719ba69e530f" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.839827 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.853189 4848 scope.go:117] "RemoveContainer" containerID="db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.855985 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.862312 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 09:25:18 crc kubenswrapper[4848]: E0217 09:25:18.862809 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db50eaa9-ca0a-4a83-98d8-fce82f849d91" containerName="setup-container" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.862875 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="db50eaa9-ca0a-4a83-98d8-fce82f849d91" containerName="setup-container" Feb 17 09:25:18 crc kubenswrapper[4848]: E0217 09:25:18.862937 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd7e9b9b-99f0-4720-b997-3f00996972e5" 
containerName="setup-container" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.862987 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd7e9b9b-99f0-4720-b997-3f00996972e5" containerName="setup-container" Feb 17 09:25:18 crc kubenswrapper[4848]: E0217 09:25:18.863083 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db50eaa9-ca0a-4a83-98d8-fce82f849d91" containerName="rabbitmq" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.863144 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="db50eaa9-ca0a-4a83-98d8-fce82f849d91" containerName="rabbitmq" Feb 17 09:25:18 crc kubenswrapper[4848]: E0217 09:25:18.863201 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd7e9b9b-99f0-4720-b997-3f00996972e5" containerName="rabbitmq" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.863250 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd7e9b9b-99f0-4720-b997-3f00996972e5" containerName="rabbitmq" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.863551 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="db50eaa9-ca0a-4a83-98d8-fce82f849d91" containerName="rabbitmq" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.863619 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd7e9b9b-99f0-4720-b997-3f00996972e5" containerName="rabbitmq" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.864574 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.866481 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.866623 4848 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/bd7e9b9b-99f0-4720-b997-3f00996972e5-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.867892 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-sbvg9" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.868356 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.868544 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.870485 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.870562 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.870660 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.870722 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.882569 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.890893 4848 scope.go:117] "RemoveContainer" 
containerID="ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.918102 4848 scope.go:117] "RemoveContainer" containerID="db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39" Feb 17 09:25:18 crc kubenswrapper[4848]: E0217 09:25:18.918619 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39\": container with ID starting with db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39 not found: ID does not exist" containerID="db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.918648 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39"} err="failed to get container status \"db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39\": rpc error: code = NotFound desc = could not find container \"db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39\": container with ID starting with db8066450874afb857859ed156351663263e471d9f54d2e2df0c4b84bad94d39 not found: ID does not exist" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.918669 4848 scope.go:117] "RemoveContainer" containerID="ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71" Feb 17 09:25:18 crc kubenswrapper[4848]: E0217 09:25:18.919351 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71\": container with ID starting with ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71 not found: ID does not exist" containerID="ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71" Feb 17 09:25:18 crc 
kubenswrapper[4848]: I0217 09:25:18.919377 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71"} err="failed to get container status \"ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71\": rpc error: code = NotFound desc = could not find container \"ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71\": container with ID starting with ae0120d0e18108904cbf2f561462654c7c922d6913d36d2d77ef7a061136de71 not found: ID does not exist" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.967912 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30e48298-cbbd-4637-83a9-733efaaf0756-server-conf\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.967977 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.968021 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.968047 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30e48298-cbbd-4637-83a9-733efaaf0756-pod-info\") pod 
\"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.968076 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.968097 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45h72\" (UniqueName: \"kubernetes.io/projected/30e48298-cbbd-4637-83a9-733efaaf0756-kube-api-access-45h72\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.968314 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30e48298-cbbd-4637-83a9-733efaaf0756-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.968334 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.968366 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30e48298-cbbd-4637-83a9-733efaaf0756-config-data\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " 
pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.968383 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:18 crc kubenswrapper[4848]: I0217 09:25:18.968401 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30e48298-cbbd-4637-83a9-733efaaf0756-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.051707 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.060452 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.069630 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45h72\" (UniqueName: \"kubernetes.io/projected/30e48298-cbbd-4637-83a9-733efaaf0756-kube-api-access-45h72\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.069700 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30e48298-cbbd-4637-83a9-733efaaf0756-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.069731 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.069803 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30e48298-cbbd-4637-83a9-733efaaf0756-config-data\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.069829 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.069853 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30e48298-cbbd-4637-83a9-733efaaf0756-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.069932 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30e48298-cbbd-4637-83a9-733efaaf0756-server-conf\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.069985 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-plugins\") pod 
\"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.070034 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.070060 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30e48298-cbbd-4637-83a9-733efaaf0756-pod-info\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.070097 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.070262 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.071279 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.071740 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/30e48298-cbbd-4637-83a9-733efaaf0756-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.071948 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/30e48298-cbbd-4637-83a9-733efaaf0756-server-conf\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.072103 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/30e48298-cbbd-4637-83a9-733efaaf0756-config-data\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.072182 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.074935 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.074937 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/30e48298-cbbd-4637-83a9-733efaaf0756-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.079938 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/30e48298-cbbd-4637-83a9-733efaaf0756-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.080068 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/30e48298-cbbd-4637-83a9-733efaaf0756-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.089708 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.091943 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.094095 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vsrr6" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.095557 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.096417 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.096808 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.097129 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.097377 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.097787 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.106516 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45h72\" (UniqueName: \"kubernetes.io/projected/30e48298-cbbd-4637-83a9-733efaaf0756-kube-api-access-45h72\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.108158 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.110144 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-server-0\" (UID: \"30e48298-cbbd-4637-83a9-733efaaf0756\") " pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.171571 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.171872 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.171973 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.172046 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.172158 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-pod-info\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.172233 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.172308 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.172420 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkwvh\" (UniqueName: \"kubernetes.io/projected/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-kube-api-access-tkwvh\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.172506 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.172577 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.172676 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.204191 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.274957 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275094 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275134 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275167 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275190 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275239 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275274 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275306 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275371 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkwvh\" (UniqueName: 
\"kubernetes.io/projected/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-kube-api-access-tkwvh\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275406 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275427 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.275967 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.276293 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.277140 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-config-data\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.279013 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.279159 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.279232 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.279598 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.280114 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 
09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.295672 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.299547 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.312994 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkwvh\" (UniqueName: \"kubernetes.io/projected/b5fc69cf-e25e-4ca3-adc3-36b1678691e1-kube-api-access-tkwvh\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.328401 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b5fc69cf-e25e-4ca3-adc3-36b1678691e1\") " pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.405125 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd7e9b9b-99f0-4720-b997-3f00996972e5" path="/var/lib/kubelet/pods/bd7e9b9b-99f0-4720-b997-3f00996972e5/volumes" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.406730 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db50eaa9-ca0a-4a83-98d8-fce82f849d91" path="/var/lib/kubelet/pods/db50eaa9-ca0a-4a83-98d8-fce82f849d91/volumes" Feb 17 09:25:19 crc 
kubenswrapper[4848]: I0217 09:25:19.447491 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.644723 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.734263 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"30e48298-cbbd-4637-83a9-733efaaf0756","Type":"ContainerStarted","Data":"881107a94da1c3f8d839960a073c54c2511ec27b659c08509eb6e563b25f67b7"} Feb 17 09:25:19 crc kubenswrapper[4848]: I0217 09:25:19.908080 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 17 09:25:19 crc kubenswrapper[4848]: W0217 09:25:19.913023 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5fc69cf_e25e_4ca3_adc3_36b1678691e1.slice/crio-9bee5e9c576a9c629083976b9e940b913d2bac7ff0175808c903774e7fe81b1b WatchSource:0}: Error finding container 9bee5e9c576a9c629083976b9e940b913d2bac7ff0175808c903774e7fe81b1b: Status 404 returned error can't find the container with id 9bee5e9c576a9c629083976b9e940b913d2bac7ff0175808c903774e7fe81b1b Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.357125 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ddf55cb67-wpvmv"] Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.364694 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.371647 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.394078 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddf55cb67-wpvmv"] Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.504517 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.504915 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.504983 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.505062 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-svc\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " 
pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.505153 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-config\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.505375 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-openstack-edpm-ipam\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.505535 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mw8s5\" (UniqueName: \"kubernetes.io/projected/4e1d09ea-fbdf-4703-8a13-e7a86b588372-kube-api-access-mw8s5\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.606853 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-svc\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.606899 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-config\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " 
pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.606959 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-openstack-edpm-ipam\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.607022 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mw8s5\" (UniqueName: \"kubernetes.io/projected/4e1d09ea-fbdf-4703-8a13-e7a86b588372-kube-api-access-mw8s5\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.607092 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.607186 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.607245 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " 
pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.607786 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-svc\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.607895 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-config\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.608229 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-nb\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.608503 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-swift-storage-0\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.608615 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-sb\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.608864 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-openstack-edpm-ipam\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.686036 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mw8s5\" (UniqueName: \"kubernetes.io/projected/4e1d09ea-fbdf-4703-8a13-e7a86b588372-kube-api-access-mw8s5\") pod \"dnsmasq-dns-5ddf55cb67-wpvmv\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.753043 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b5fc69cf-e25e-4ca3-adc3-36b1678691e1","Type":"ContainerStarted","Data":"9bee5e9c576a9c629083976b9e940b913d2bac7ff0175808c903774e7fe81b1b"} Feb 17 09:25:20 crc kubenswrapper[4848]: I0217 09:25:20.986516 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:21 crc kubenswrapper[4848]: I0217 09:25:21.296580 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ddf55cb67-wpvmv"] Feb 17 09:25:21 crc kubenswrapper[4848]: I0217 09:25:21.776856 4848 generic.go:334] "Generic (PLEG): container finished" podID="4e1d09ea-fbdf-4703-8a13-e7a86b588372" containerID="e65086047553c1146146e2c1f45add3a423efb8e2b7619637aebac4c49c9893a" exitCode=0 Feb 17 09:25:21 crc kubenswrapper[4848]: I0217 09:25:21.777059 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" event={"ID":"4e1d09ea-fbdf-4703-8a13-e7a86b588372","Type":"ContainerDied","Data":"e65086047553c1146146e2c1f45add3a423efb8e2b7619637aebac4c49c9893a"} Feb 17 09:25:21 crc kubenswrapper[4848]: I0217 09:25:21.777196 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" event={"ID":"4e1d09ea-fbdf-4703-8a13-e7a86b588372","Type":"ContainerStarted","Data":"a0fa9b4042c6c21284bfae3bcc2c34e7042eca09ca4575862c3ae4d1c2efdfec"} Feb 17 09:25:21 crc kubenswrapper[4848]: I0217 09:25:21.780606 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"30e48298-cbbd-4637-83a9-733efaaf0756","Type":"ContainerStarted","Data":"85835f82e3fb542c8d6464a4cefe9bebff559dd555b1c01f49ba1ee3522d033d"} Feb 17 09:25:21 crc kubenswrapper[4848]: I0217 09:25:21.783469 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b5fc69cf-e25e-4ca3-adc3-36b1678691e1","Type":"ContainerStarted","Data":"5423e28af991c9526bfe60fa57952d519fa289d42468ee1296a6fcfa19a63fe1"} Feb 17 09:25:22 crc kubenswrapper[4848]: I0217 09:25:22.795794 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" 
event={"ID":"4e1d09ea-fbdf-4703-8a13-e7a86b588372","Type":"ContainerStarted","Data":"20c623148dc04029f1861be7041d7d8eab6027744e8755f708af331c12fa21ce"} Feb 17 09:25:22 crc kubenswrapper[4848]: I0217 09:25:22.796395 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:22 crc kubenswrapper[4848]: I0217 09:25:22.832326 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" podStartSLOduration=2.832309918 podStartE2EDuration="2.832309918s" podCreationTimestamp="2026-02-17 09:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:25:22.824192306 +0000 UTC m=+1200.367447992" watchObservedRunningTime="2026-02-17 09:25:22.832309918 +0000 UTC m=+1200.375565564" Feb 17 09:25:30 crc kubenswrapper[4848]: I0217 09:25:30.989038 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.067314 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-gt9wf"] Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.068043 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf" podUID="87968358-75c5-4fb2-b3b1-8cf4e806611d" containerName="dnsmasq-dns" containerID="cri-o://812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845" gracePeriod=10 Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.262209 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b9df5dcdc-8rdwv"] Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.263658 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.285519 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b9df5dcdc-8rdwv"] Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.382446 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-ovsdbserver-sb\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.382510 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-ovsdbserver-nb\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.382551 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjpxq\" (UniqueName: \"kubernetes.io/projected/84c45378-b510-419e-83b7-b92a19292d39-kube-api-access-kjpxq\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.382580 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-openstack-edpm-ipam\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.382859 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-dns-swift-storage-0\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.383042 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-config\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.383094 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-dns-svc\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.485473 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-dns-swift-storage-0\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.485551 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-config\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.485579 4848 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-dns-svc\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.485614 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-ovsdbserver-sb\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.485671 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-ovsdbserver-nb\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.486178 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjpxq\" (UniqueName: \"kubernetes.io/projected/84c45378-b510-419e-83b7-b92a19292d39-kube-api-access-kjpxq\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.486552 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-dns-svc\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.486711 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-config\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.486781 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-ovsdbserver-nb\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.486850 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-openstack-edpm-ipam\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.487011 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-ovsdbserver-sb\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.487059 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-dns-swift-storage-0\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.487571 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/84c45378-b510-419e-83b7-b92a19292d39-openstack-edpm-ipam\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.512323 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjpxq\" (UniqueName: \"kubernetes.io/projected/84c45378-b510-419e-83b7-b92a19292d39-kube-api-access-kjpxq\") pod \"dnsmasq-dns-b9df5dcdc-8rdwv\" (UID: \"84c45378-b510-419e-83b7-b92a19292d39\") " pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.593218 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.603242 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.690318 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9drct\" (UniqueName: \"kubernetes.io/projected/87968358-75c5-4fb2-b3b1-8cf4e806611d-kube-api-access-9drct\") pod \"87968358-75c5-4fb2-b3b1-8cf4e806611d\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.691525 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-svc\") pod \"87968358-75c5-4fb2-b3b1-8cf4e806611d\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.692027 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-swift-storage-0\") pod 
\"87968358-75c5-4fb2-b3b1-8cf4e806611d\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.692221 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-nb\") pod \"87968358-75c5-4fb2-b3b1-8cf4e806611d\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.692372 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-sb\") pod \"87968358-75c5-4fb2-b3b1-8cf4e806611d\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.692598 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-config\") pod \"87968358-75c5-4fb2-b3b1-8cf4e806611d\" (UID: \"87968358-75c5-4fb2-b3b1-8cf4e806611d\") " Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.701016 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87968358-75c5-4fb2-b3b1-8cf4e806611d-kube-api-access-9drct" (OuterVolumeSpecName: "kube-api-access-9drct") pod "87968358-75c5-4fb2-b3b1-8cf4e806611d" (UID: "87968358-75c5-4fb2-b3b1-8cf4e806611d"). InnerVolumeSpecName "kube-api-access-9drct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.758802 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "87968358-75c5-4fb2-b3b1-8cf4e806611d" (UID: "87968358-75c5-4fb2-b3b1-8cf4e806611d"). 
InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.759461 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-config" (OuterVolumeSpecName: "config") pod "87968358-75c5-4fb2-b3b1-8cf4e806611d" (UID: "87968358-75c5-4fb2-b3b1-8cf4e806611d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.762880 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "87968358-75c5-4fb2-b3b1-8cf4e806611d" (UID: "87968358-75c5-4fb2-b3b1-8cf4e806611d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.767427 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "87968358-75c5-4fb2-b3b1-8cf4e806611d" (UID: "87968358-75c5-4fb2-b3b1-8cf4e806611d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.782467 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "87968358-75c5-4fb2-b3b1-8cf4e806611d" (UID: "87968358-75c5-4fb2-b3b1-8cf4e806611d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.795309 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9drct\" (UniqueName: \"kubernetes.io/projected/87968358-75c5-4fb2-b3b1-8cf4e806611d-kube-api-access-9drct\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.795347 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.795360 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.795368 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.795377 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.795384 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/87968358-75c5-4fb2-b3b1-8cf4e806611d-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.908009 4848 generic.go:334] "Generic (PLEG): container finished" podID="87968358-75c5-4fb2-b3b1-8cf4e806611d" containerID="812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845" exitCode=0 Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.908050 4848 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf" event={"ID":"87968358-75c5-4fb2-b3b1-8cf4e806611d","Type":"ContainerDied","Data":"812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845"} Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.908079 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf" event={"ID":"87968358-75c5-4fb2-b3b1-8cf4e806611d","Type":"ContainerDied","Data":"2277c6a4a4e0b718cf9a65c4d061c702159c6ce7a0a6769b4aad8a37ca551a73"} Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.908095 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7dcd758995-gt9wf" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.908097 4848 scope.go:117] "RemoveContainer" containerID="812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.928982 4848 scope.go:117] "RemoveContainer" containerID="5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.947868 4848 scope.go:117] "RemoveContainer" containerID="812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845" Feb 17 09:25:31 crc kubenswrapper[4848]: E0217 09:25:31.948337 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845\": container with ID starting with 812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845 not found: ID does not exist" containerID="812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.948424 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845"} err="failed to get container status 
\"812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845\": rpc error: code = NotFound desc = could not find container \"812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845\": container with ID starting with 812f20de746cd482b51b05aa2e429d52f314d3c43b6d7ba59ee7e82afbfcf845 not found: ID does not exist" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.948459 4848 scope.go:117] "RemoveContainer" containerID="5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed" Feb 17 09:25:31 crc kubenswrapper[4848]: E0217 09:25:31.948846 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed\": container with ID starting with 5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed not found: ID does not exist" containerID="5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.949055 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed"} err="failed to get container status \"5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed\": rpc error: code = NotFound desc = could not find container \"5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed\": container with ID starting with 5c45d6a87fa7bb0068915bc14d1a6726f5eb3410cb0516f44364adadbd1668ed not found: ID does not exist" Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.950902 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-gt9wf"] Feb 17 09:25:31 crc kubenswrapper[4848]: I0217 09:25:31.958593 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7dcd758995-gt9wf"] Feb 17 09:25:32 crc kubenswrapper[4848]: I0217 09:25:32.108278 4848 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-b9df5dcdc-8rdwv"] Feb 17 09:25:32 crc kubenswrapper[4848]: I0217 09:25:32.919248 4848 generic.go:334] "Generic (PLEG): container finished" podID="84c45378-b510-419e-83b7-b92a19292d39" containerID="4c65785997f2714744bf7f28cd37dfe4e85a9f0aae1f8bdcc9affa9a6ac33916" exitCode=0 Feb 17 09:25:32 crc kubenswrapper[4848]: I0217 09:25:32.919297 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" event={"ID":"84c45378-b510-419e-83b7-b92a19292d39","Type":"ContainerDied","Data":"4c65785997f2714744bf7f28cd37dfe4e85a9f0aae1f8bdcc9affa9a6ac33916"} Feb 17 09:25:32 crc kubenswrapper[4848]: I0217 09:25:32.919633 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" event={"ID":"84c45378-b510-419e-83b7-b92a19292d39","Type":"ContainerStarted","Data":"7ea00b0dd24f4dc1ac3073ee19a7bb5afbd8938d781e67cd0d0bef911b2bd37d"} Feb 17 09:25:33 crc kubenswrapper[4848]: I0217 09:25:33.400790 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87968358-75c5-4fb2-b3b1-8cf4e806611d" path="/var/lib/kubelet/pods/87968358-75c5-4fb2-b3b1-8cf4e806611d/volumes" Feb 17 09:25:33 crc kubenswrapper[4848]: I0217 09:25:33.950918 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" event={"ID":"84c45378-b510-419e-83b7-b92a19292d39","Type":"ContainerStarted","Data":"7e858a11d61303d0c197f846e5ebc5024612d3f3823ca93181e82b265e01a17d"} Feb 17 09:25:33 crc kubenswrapper[4848]: I0217 09:25:33.952060 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:33 crc kubenswrapper[4848]: I0217 09:25:33.989624 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" podStartSLOduration=2.989592659 podStartE2EDuration="2.989592659s" podCreationTimestamp="2026-02-17 09:25:31 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:25:33.977894665 +0000 UTC m=+1211.521150361" watchObservedRunningTime="2026-02-17 09:25:33.989592659 +0000 UTC m=+1211.532848345" Feb 17 09:25:41 crc kubenswrapper[4848]: I0217 09:25:41.595582 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b9df5dcdc-8rdwv" Feb 17 09:25:41 crc kubenswrapper[4848]: I0217 09:25:41.688007 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddf55cb67-wpvmv"] Feb 17 09:25:41 crc kubenswrapper[4848]: I0217 09:25:41.688744 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" podUID="4e1d09ea-fbdf-4703-8a13-e7a86b588372" containerName="dnsmasq-dns" containerID="cri-o://20c623148dc04029f1861be7041d7d8eab6027744e8755f708af331c12fa21ce" gracePeriod=10 Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.043746 4848 generic.go:334] "Generic (PLEG): container finished" podID="4e1d09ea-fbdf-4703-8a13-e7a86b588372" containerID="20c623148dc04029f1861be7041d7d8eab6027744e8755f708af331c12fa21ce" exitCode=0 Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.043793 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" event={"ID":"4e1d09ea-fbdf-4703-8a13-e7a86b588372","Type":"ContainerDied","Data":"20c623148dc04029f1861be7041d7d8eab6027744e8755f708af331c12fa21ce"} Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.160735 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.337569 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-swift-storage-0\") pod \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.337629 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mw8s5\" (UniqueName: \"kubernetes.io/projected/4e1d09ea-fbdf-4703-8a13-e7a86b588372-kube-api-access-mw8s5\") pod \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.337717 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-config\") pod \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.337798 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-nb\") pod \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.337887 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-openstack-edpm-ipam\") pod \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.337938 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-sb\") pod \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.337992 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-svc\") pod \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\" (UID: \"4e1d09ea-fbdf-4703-8a13-e7a86b588372\") " Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.352975 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e1d09ea-fbdf-4703-8a13-e7a86b588372-kube-api-access-mw8s5" (OuterVolumeSpecName: "kube-api-access-mw8s5") pod "4e1d09ea-fbdf-4703-8a13-e7a86b588372" (UID: "4e1d09ea-fbdf-4703-8a13-e7a86b588372"). InnerVolumeSpecName "kube-api-access-mw8s5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.389813 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4e1d09ea-fbdf-4703-8a13-e7a86b588372" (UID: "4e1d09ea-fbdf-4703-8a13-e7a86b588372"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.393118 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4e1d09ea-fbdf-4703-8a13-e7a86b588372" (UID: "4e1d09ea-fbdf-4703-8a13-e7a86b588372"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.397591 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "4e1d09ea-fbdf-4703-8a13-e7a86b588372" (UID: "4e1d09ea-fbdf-4703-8a13-e7a86b588372"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.399398 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4e1d09ea-fbdf-4703-8a13-e7a86b588372" (UID: "4e1d09ea-fbdf-4703-8a13-e7a86b588372"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.400394 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-config" (OuterVolumeSpecName: "config") pod "4e1d09ea-fbdf-4703-8a13-e7a86b588372" (UID: "4e1d09ea-fbdf-4703-8a13-e7a86b588372"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.406696 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4e1d09ea-fbdf-4703-8a13-e7a86b588372" (UID: "4e1d09ea-fbdf-4703-8a13-e7a86b588372"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.440445 4848 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.440484 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mw8s5\" (UniqueName: \"kubernetes.io/projected/4e1d09ea-fbdf-4703-8a13-e7a86b588372-kube-api-access-mw8s5\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.440498 4848 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.440511 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.440522 4848 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.440530 4848 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:42 crc kubenswrapper[4848]: I0217 09:25:42.440540 4848 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4e1d09ea-fbdf-4703-8a13-e7a86b588372-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 17 09:25:43 crc kubenswrapper[4848]: I0217 09:25:43.055892 
4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" event={"ID":"4e1d09ea-fbdf-4703-8a13-e7a86b588372","Type":"ContainerDied","Data":"a0fa9b4042c6c21284bfae3bcc2c34e7042eca09ca4575862c3ae4d1c2efdfec"} Feb 17 09:25:43 crc kubenswrapper[4848]: I0217 09:25:43.056351 4848 scope.go:117] "RemoveContainer" containerID="20c623148dc04029f1861be7041d7d8eab6027744e8755f708af331c12fa21ce" Feb 17 09:25:43 crc kubenswrapper[4848]: I0217 09:25:43.056026 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ddf55cb67-wpvmv" Feb 17 09:25:43 crc kubenswrapper[4848]: I0217 09:25:43.109879 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ddf55cb67-wpvmv"] Feb 17 09:25:43 crc kubenswrapper[4848]: I0217 09:25:43.114168 4848 scope.go:117] "RemoveContainer" containerID="e65086047553c1146146e2c1f45add3a423efb8e2b7619637aebac4c49c9893a" Feb 17 09:25:43 crc kubenswrapper[4848]: I0217 09:25:43.119277 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ddf55cb67-wpvmv"] Feb 17 09:25:43 crc kubenswrapper[4848]: I0217 09:25:43.438209 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e1d09ea-fbdf-4703-8a13-e7a86b588372" path="/var/lib/kubelet/pods/4e1d09ea-fbdf-4703-8a13-e7a86b588372/volumes" Feb 17 09:25:48 crc kubenswrapper[4848]: I0217 09:25:48.771984 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:25:48 crc kubenswrapper[4848]: I0217 09:25:48.772588 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:25:53 crc kubenswrapper[4848]: E0217 09:25:53.996085 4848 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5fc69cf_e25e_4ca3_adc3_36b1678691e1.slice/crio-5423e28af991c9526bfe60fa57952d519fa289d42468ee1296a6fcfa19a63fe1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5fc69cf_e25e_4ca3_adc3_36b1678691e1.slice/crio-conmon-5423e28af991c9526bfe60fa57952d519fa289d42468ee1296a6fcfa19a63fe1.scope\": RecentStats: unable to find data in memory cache]" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.232486 4848 generic.go:334] "Generic (PLEG): container finished" podID="30e48298-cbbd-4637-83a9-733efaaf0756" containerID="85835f82e3fb542c8d6464a4cefe9bebff559dd555b1c01f49ba1ee3522d033d" exitCode=0 Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.232587 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"30e48298-cbbd-4637-83a9-733efaaf0756","Type":"ContainerDied","Data":"85835f82e3fb542c8d6464a4cefe9bebff559dd555b1c01f49ba1ee3522d033d"} Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.237271 4848 generic.go:334] "Generic (PLEG): container finished" podID="b5fc69cf-e25e-4ca3-adc3-36b1678691e1" containerID="5423e28af991c9526bfe60fa57952d519fa289d42468ee1296a6fcfa19a63fe1" exitCode=0 Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.237308 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b5fc69cf-e25e-4ca3-adc3-36b1678691e1","Type":"ContainerDied","Data":"5423e28af991c9526bfe60fa57952d519fa289d42468ee1296a6fcfa19a63fe1"} Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.310254 4848 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29"] Feb 17 09:25:54 crc kubenswrapper[4848]: E0217 09:25:54.310702 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1d09ea-fbdf-4703-8a13-e7a86b588372" containerName="init" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.310720 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1d09ea-fbdf-4703-8a13-e7a86b588372" containerName="init" Feb 17 09:25:54 crc kubenswrapper[4848]: E0217 09:25:54.310741 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87968358-75c5-4fb2-b3b1-8cf4e806611d" containerName="init" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.310749 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="87968358-75c5-4fb2-b3b1-8cf4e806611d" containerName="init" Feb 17 09:25:54 crc kubenswrapper[4848]: E0217 09:25:54.310772 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87968358-75c5-4fb2-b3b1-8cf4e806611d" containerName="dnsmasq-dns" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.310780 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="87968358-75c5-4fb2-b3b1-8cf4e806611d" containerName="dnsmasq-dns" Feb 17 09:25:54 crc kubenswrapper[4848]: E0217 09:25:54.310794 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e1d09ea-fbdf-4703-8a13-e7a86b588372" containerName="dnsmasq-dns" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.310801 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e1d09ea-fbdf-4703-8a13-e7a86b588372" containerName="dnsmasq-dns" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.310983 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="87968358-75c5-4fb2-b3b1-8cf4e806611d" containerName="dnsmasq-dns" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.311155 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e1d09ea-fbdf-4703-8a13-e7a86b588372" 
containerName="dnsmasq-dns" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.311744 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.315592 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.315885 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.315886 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.316883 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.338251 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29"] Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.403473 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.404039 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.404127 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.404183 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkw5w\" (UniqueName: \"kubernetes.io/projected/6dc6526d-a2c1-40d1-a503-71c4315cc00c-kube-api-access-rkw5w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.505282 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.505395 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.505435 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.505469 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkw5w\" (UniqueName: \"kubernetes.io/projected/6dc6526d-a2c1-40d1-a503-71c4315cc00c-kube-api-access-rkw5w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.511149 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.511306 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.516477 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-inventory\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.523864 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkw5w\" (UniqueName: \"kubernetes.io/projected/6dc6526d-a2c1-40d1-a503-71c4315cc00c-kube-api-access-rkw5w\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:54 crc kubenswrapper[4848]: I0217 09:25:54.752414 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:25:55 crc kubenswrapper[4848]: I0217 09:25:55.246084 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"30e48298-cbbd-4637-83a9-733efaaf0756","Type":"ContainerStarted","Data":"1fa64bf2c0cade82923ecce7fd30fcca4041fea73da8a34b04c84663333cdccc"} Feb 17 09:25:55 crc kubenswrapper[4848]: I0217 09:25:55.246912 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 17 09:25:55 crc kubenswrapper[4848]: I0217 09:25:55.248056 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b5fc69cf-e25e-4ca3-adc3-36b1678691e1","Type":"ContainerStarted","Data":"33aa05d7b772d9d808b92f0e952b3fbef4078d1297b3302664e531603ae2e477"} Feb 17 09:25:55 crc kubenswrapper[4848]: I0217 09:25:55.248722 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:25:55 crc kubenswrapper[4848]: I0217 09:25:55.272070 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.272049041 
podStartE2EDuration="37.272049041s" podCreationTimestamp="2026-02-17 09:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:25:55.270770695 +0000 UTC m=+1232.814026341" watchObservedRunningTime="2026-02-17 09:25:55.272049041 +0000 UTC m=+1232.815304687" Feb 17 09:25:55 crc kubenswrapper[4848]: I0217 09:25:55.305446 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.305425634 podStartE2EDuration="36.305425634s" podCreationTimestamp="2026-02-17 09:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:25:55.297355724 +0000 UTC m=+1232.840611360" watchObservedRunningTime="2026-02-17 09:25:55.305425634 +0000 UTC m=+1232.848681280" Feb 17 09:25:55 crc kubenswrapper[4848]: I0217 09:25:55.444109 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 09:25:55 crc kubenswrapper[4848]: I0217 09:25:55.448016 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29"] Feb 17 09:25:56 crc kubenswrapper[4848]: I0217 09:25:56.258319 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" event={"ID":"6dc6526d-a2c1-40d1-a503-71c4315cc00c","Type":"ContainerStarted","Data":"6dc2cb26a78637981d4ef6c2186f6d380cb80b317614f935c9664b5d1a06ce01"} Feb 17 09:26:04 crc kubenswrapper[4848]: I0217 09:26:04.371969 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" event={"ID":"6dc6526d-a2c1-40d1-a503-71c4315cc00c","Type":"ContainerStarted","Data":"101feb8ae46f5f3d055b8b7a0b183c1eaf6068f631c30181b98dde4941c2234a"} Feb 17 09:26:04 crc kubenswrapper[4848]: 
I0217 09:26:04.395868 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" podStartSLOduration=2.178414829 podStartE2EDuration="10.395847792s" podCreationTimestamp="2026-02-17 09:25:54 +0000 UTC" firstStartedPulling="2026-02-17 09:25:55.443887238 +0000 UTC m=+1232.987142884" lastFinishedPulling="2026-02-17 09:26:03.661320191 +0000 UTC m=+1241.204575847" observedRunningTime="2026-02-17 09:26:04.388094681 +0000 UTC m=+1241.931350347" watchObservedRunningTime="2026-02-17 09:26:04.395847792 +0000 UTC m=+1241.939103438" Feb 17 09:26:09 crc kubenswrapper[4848]: I0217 09:26:09.207137 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 17 09:26:09 crc kubenswrapper[4848]: I0217 09:26:09.450982 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Feb 17 09:26:14 crc kubenswrapper[4848]: I0217 09:26:14.495421 4848 generic.go:334] "Generic (PLEG): container finished" podID="6dc6526d-a2c1-40d1-a503-71c4315cc00c" containerID="101feb8ae46f5f3d055b8b7a0b183c1eaf6068f631c30181b98dde4941c2234a" exitCode=0 Feb 17 09:26:14 crc kubenswrapper[4848]: I0217 09:26:14.495547 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" event={"ID":"6dc6526d-a2c1-40d1-a503-71c4315cc00c","Type":"ContainerDied","Data":"101feb8ae46f5f3d055b8b7a0b183c1eaf6068f631c30181b98dde4941c2234a"} Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.099296 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.274831 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkw5w\" (UniqueName: \"kubernetes.io/projected/6dc6526d-a2c1-40d1-a503-71c4315cc00c-kube-api-access-rkw5w\") pod \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.274902 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-inventory\") pod \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.274965 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-ssh-key-openstack-edpm-ipam\") pod \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.275047 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-repo-setup-combined-ca-bundle\") pod \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\" (UID: \"6dc6526d-a2c1-40d1-a503-71c4315cc00c\") " Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.284115 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "6dc6526d-a2c1-40d1-a503-71c4315cc00c" (UID: "6dc6526d-a2c1-40d1-a503-71c4315cc00c"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.284583 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6dc6526d-a2c1-40d1-a503-71c4315cc00c-kube-api-access-rkw5w" (OuterVolumeSpecName: "kube-api-access-rkw5w") pod "6dc6526d-a2c1-40d1-a503-71c4315cc00c" (UID: "6dc6526d-a2c1-40d1-a503-71c4315cc00c"). InnerVolumeSpecName "kube-api-access-rkw5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.329959 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6dc6526d-a2c1-40d1-a503-71c4315cc00c" (UID: "6dc6526d-a2c1-40d1-a503-71c4315cc00c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.331721 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-inventory" (OuterVolumeSpecName: "inventory") pod "6dc6526d-a2c1-40d1-a503-71c4315cc00c" (UID: "6dc6526d-a2c1-40d1-a503-71c4315cc00c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.377968 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkw5w\" (UniqueName: \"kubernetes.io/projected/6dc6526d-a2c1-40d1-a503-71c4315cc00c-kube-api-access-rkw5w\") on node \"crc\" DevicePath \"\"" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.378013 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.378029 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.378042 4848 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6dc6526d-a2c1-40d1-a503-71c4315cc00c-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.732950 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" event={"ID":"6dc6526d-a2c1-40d1-a503-71c4315cc00c","Type":"ContainerDied","Data":"6dc2cb26a78637981d4ef6c2186f6d380cb80b317614f935c9664b5d1a06ce01"} Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.732996 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc2cb26a78637981d4ef6c2186f6d380cb80b317614f935c9664b5d1a06ce01" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.733006 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.778712 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz"] Feb 17 09:26:16 crc kubenswrapper[4848]: E0217 09:26:16.779520 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6dc6526d-a2c1-40d1-a503-71c4315cc00c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.779538 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6dc6526d-a2c1-40d1-a503-71c4315cc00c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.779751 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6dc6526d-a2c1-40d1-a503-71c4315cc00c" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.780342 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.783227 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.783394 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.783227 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.784597 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.789244 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz"] Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.885740 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hm9jz\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.885848 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbxw\" (UniqueName: \"kubernetes.io/projected/9762dbd7-6ed8-433c-a176-402586491e40-kube-api-access-gbbxw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hm9jz\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.885913 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hm9jz\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.987653 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hm9jz\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.987985 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbxw\" (UniqueName: \"kubernetes.io/projected/9762dbd7-6ed8-433c-a176-402586491e40-kube-api-access-gbbxw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hm9jz\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.988120 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hm9jz\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:16 crc kubenswrapper[4848]: I0217 09:26:16.994605 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hm9jz\" (UID: 
\"9762dbd7-6ed8-433c-a176-402586491e40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:17 crc kubenswrapper[4848]: I0217 09:26:17.002200 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hm9jz\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:17 crc kubenswrapper[4848]: I0217 09:26:17.004534 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbxw\" (UniqueName: \"kubernetes.io/projected/9762dbd7-6ed8-433c-a176-402586491e40-kube-api-access-gbbxw\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-hm9jz\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:17 crc kubenswrapper[4848]: I0217 09:26:17.100669 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:17 crc kubenswrapper[4848]: I0217 09:26:17.730994 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz"] Feb 17 09:26:18 crc kubenswrapper[4848]: I0217 09:26:18.759888 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" event={"ID":"9762dbd7-6ed8-433c-a176-402586491e40","Type":"ContainerStarted","Data":"8e581dba53cfb8749ac251a3c96f283758f38bbf9e4a3698b716735638048368"} Feb 17 09:26:18 crc kubenswrapper[4848]: I0217 09:26:18.760470 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" event={"ID":"9762dbd7-6ed8-433c-a176-402586491e40","Type":"ContainerStarted","Data":"f57955ff1dc9be68354df274ee025858c869c3ed34f4d1b3358cb5ef6fc89936"} Feb 17 09:26:18 crc kubenswrapper[4848]: I0217 09:26:18.772531 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:26:18 crc kubenswrapper[4848]: I0217 09:26:18.772677 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:26:18 crc kubenswrapper[4848]: I0217 09:26:18.772806 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:26:18 crc kubenswrapper[4848]: I0217 09:26:18.773947 4848 kuberuntime_manager.go:1027] 
"Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"42389abaf9cd91ca5ab94c566a487ee2516f882d6f353baa7369223b1e0966e6"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:26:18 crc kubenswrapper[4848]: I0217 09:26:18.774105 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://42389abaf9cd91ca5ab94c566a487ee2516f882d6f353baa7369223b1e0966e6" gracePeriod=600 Feb 17 09:26:18 crc kubenswrapper[4848]: I0217 09:26:18.795697 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" podStartSLOduration=2.40464859 podStartE2EDuration="2.795670085s" podCreationTimestamp="2026-02-17 09:26:16 +0000 UTC" firstStartedPulling="2026-02-17 09:26:17.745158591 +0000 UTC m=+1255.288414267" lastFinishedPulling="2026-02-17 09:26:18.136180086 +0000 UTC m=+1255.679435762" observedRunningTime="2026-02-17 09:26:18.779209315 +0000 UTC m=+1256.322465051" watchObservedRunningTime="2026-02-17 09:26:18.795670085 +0000 UTC m=+1256.338925771" Feb 17 09:26:19 crc kubenswrapper[4848]: I0217 09:26:19.778815 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="42389abaf9cd91ca5ab94c566a487ee2516f882d6f353baa7369223b1e0966e6" exitCode=0 Feb 17 09:26:19 crc kubenswrapper[4848]: I0217 09:26:19.781115 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"42389abaf9cd91ca5ab94c566a487ee2516f882d6f353baa7369223b1e0966e6"} Feb 17 09:26:19 crc 
kubenswrapper[4848]: I0217 09:26:19.781175 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"7308c58dc682de7c0d5cf3657a75860bc4b9b36b777a9bde17b6a927094a6302"} Feb 17 09:26:19 crc kubenswrapper[4848]: I0217 09:26:19.781206 4848 scope.go:117] "RemoveContainer" containerID="394ea7e530ba2c06a8e2c57a8f43255e3afc07d0c7f59b99a48b84ecd7fdc2a0" Feb 17 09:26:21 crc kubenswrapper[4848]: I0217 09:26:21.815350 4848 generic.go:334] "Generic (PLEG): container finished" podID="9762dbd7-6ed8-433c-a176-402586491e40" containerID="8e581dba53cfb8749ac251a3c96f283758f38bbf9e4a3698b716735638048368" exitCode=0 Feb 17 09:26:21 crc kubenswrapper[4848]: I0217 09:26:21.815469 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" event={"ID":"9762dbd7-6ed8-433c-a176-402586491e40","Type":"ContainerDied","Data":"8e581dba53cfb8749ac251a3c96f283758f38bbf9e4a3698b716735638048368"} Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.237505 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.325466 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-ssh-key-openstack-edpm-ipam\") pod \"9762dbd7-6ed8-433c-a176-402586491e40\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.325781 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-inventory\") pod \"9762dbd7-6ed8-433c-a176-402586491e40\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.325902 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbbxw\" (UniqueName: \"kubernetes.io/projected/9762dbd7-6ed8-433c-a176-402586491e40-kube-api-access-gbbxw\") pod \"9762dbd7-6ed8-433c-a176-402586491e40\" (UID: \"9762dbd7-6ed8-433c-a176-402586491e40\") " Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.332493 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9762dbd7-6ed8-433c-a176-402586491e40-kube-api-access-gbbxw" (OuterVolumeSpecName: "kube-api-access-gbbxw") pod "9762dbd7-6ed8-433c-a176-402586491e40" (UID: "9762dbd7-6ed8-433c-a176-402586491e40"). InnerVolumeSpecName "kube-api-access-gbbxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.357747 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-inventory" (OuterVolumeSpecName: "inventory") pod "9762dbd7-6ed8-433c-a176-402586491e40" (UID: "9762dbd7-6ed8-433c-a176-402586491e40"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.358966 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9762dbd7-6ed8-433c-a176-402586491e40" (UID: "9762dbd7-6ed8-433c-a176-402586491e40"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.427859 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbbxw\" (UniqueName: \"kubernetes.io/projected/9762dbd7-6ed8-433c-a176-402586491e40-kube-api-access-gbbxw\") on node \"crc\" DevicePath \"\"" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.427887 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.427897 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9762dbd7-6ed8-433c-a176-402586491e40-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.840843 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" event={"ID":"9762dbd7-6ed8-433c-a176-402586491e40","Type":"ContainerDied","Data":"f57955ff1dc9be68354df274ee025858c869c3ed34f4d1b3358cb5ef6fc89936"} Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.841096 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f57955ff1dc9be68354df274ee025858c869c3ed34f4d1b3358cb5ef6fc89936" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 
09:26:23.841149 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-hm9jz" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.931138 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2"] Feb 17 09:26:23 crc kubenswrapper[4848]: E0217 09:26:23.932060 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9762dbd7-6ed8-433c-a176-402586491e40" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.932142 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9762dbd7-6ed8-433c-a176-402586491e40" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.932511 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9762dbd7-6ed8-433c-a176-402586491e40" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.933875 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.937002 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.937272 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.937602 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.938539 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:26:23 crc kubenswrapper[4848]: I0217 09:26:23.944431 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2"] Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.036890 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l48tw\" (UniqueName: \"kubernetes.io/projected/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-kube-api-access-l48tw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.036948 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 
09:26:24.036995 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.037102 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.138863 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l48tw\" (UniqueName: \"kubernetes.io/projected/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-kube-api-access-l48tw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.139150 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.139192 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.139412 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.145379 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.145438 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.150540 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.163328 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l48tw\" (UniqueName: \"kubernetes.io/projected/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-kube-api-access-l48tw\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.266046 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:26:24 crc kubenswrapper[4848]: I0217 09:26:24.913184 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2"] Feb 17 09:26:25 crc kubenswrapper[4848]: I0217 09:26:25.534824 4848 scope.go:117] "RemoveContainer" containerID="f56f15d61fe7db37060d2e564594fc7ed18b336bb021a7204f188d4b1fc14305" Feb 17 09:26:25 crc kubenswrapper[4848]: I0217 09:26:25.582261 4848 scope.go:117] "RemoveContainer" containerID="5ef7f25940670c927395f21a3af49da52d6a2cd3cf0f6c7a25327fc3127e9ac1" Feb 17 09:26:25 crc kubenswrapper[4848]: I0217 09:26:25.887636 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" event={"ID":"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e","Type":"ContainerStarted","Data":"9171ce8935eae8a5fb8cbf1489bce86dbb074c5eb5dabc96ea830c213eb6c733"} Feb 17 09:26:25 crc kubenswrapper[4848]: I0217 09:26:25.887678 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" event={"ID":"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e","Type":"ContainerStarted","Data":"2ac71df747b20b5f1c8cad62ff0524345c3ca5819f8310224f923774b1b6e984"} Feb 17 09:26:25 crc kubenswrapper[4848]: I0217 09:26:25.914968 
4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" podStartSLOduration=2.508310303 podStartE2EDuration="2.914946614s" podCreationTimestamp="2026-02-17 09:26:23 +0000 UTC" firstStartedPulling="2026-02-17 09:26:24.904896485 +0000 UTC m=+1262.448152141" lastFinishedPulling="2026-02-17 09:26:25.311532796 +0000 UTC m=+1262.854788452" observedRunningTime="2026-02-17 09:26:25.908306214 +0000 UTC m=+1263.451561870" watchObservedRunningTime="2026-02-17 09:26:25.914946614 +0000 UTC m=+1263.458202270" Feb 17 09:27:25 crc kubenswrapper[4848]: I0217 09:27:25.690938 4848 scope.go:117] "RemoveContainer" containerID="d14973f2fd21fcd48a589b585106fc57fe7b224455c3b47a22d9cdc4555c7755" Feb 17 09:27:25 crc kubenswrapper[4848]: I0217 09:27:25.749827 4848 scope.go:117] "RemoveContainer" containerID="53a02f43c94be8c0acb29110d602ddec1d1152e20f6142c0b68c52c159cdf7c9" Feb 17 09:27:25 crc kubenswrapper[4848]: I0217 09:27:25.815056 4848 scope.go:117] "RemoveContainer" containerID="626b49679a5d5e4c9c615170a4f40a8f6453615e3c804816282db06d7d975ac2" Feb 17 09:27:25 crc kubenswrapper[4848]: I0217 09:27:25.847538 4848 scope.go:117] "RemoveContainer" containerID="8269c1d1041d03962c00dcb5cfdc6acea52c87c556d0d5f799bb123be105cd15" Feb 17 09:27:25 crc kubenswrapper[4848]: I0217 09:27:25.882043 4848 scope.go:117] "RemoveContainer" containerID="865528d5abf599d39a4c21ab26182a61d99de95a407b3de598a7e1735ddb1ae5" Feb 17 09:27:25 crc kubenswrapper[4848]: I0217 09:27:25.902434 4848 scope.go:117] "RemoveContainer" containerID="a756de62b5a6db3232fab62b0e393c42a2f0617bf011c488351948bf3ac8cc14" Feb 17 09:27:25 crc kubenswrapper[4848]: I0217 09:27:25.940461 4848 scope.go:117] "RemoveContainer" containerID="ac6fa42deaa2a02c23979dcd4cf755d438c45e25801f01b8395073c61667e26e" Feb 17 09:28:26 crc kubenswrapper[4848]: I0217 09:28:26.079424 4848 scope.go:117] "RemoveContainer" 
containerID="79028113ed783eab5500a25672d8bee328b17042b779d0531bb0c6fbca5506b4" Feb 17 09:28:48 crc kubenswrapper[4848]: I0217 09:28:48.771133 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:28:48 crc kubenswrapper[4848]: I0217 09:28:48.771628 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:29:14 crc kubenswrapper[4848]: I0217 09:29:14.810580 4848 generic.go:334] "Generic (PLEG): container finished" podID="9eaf40fa-e0f2-445b-a17b-98f88fc76a5e" containerID="9171ce8935eae8a5fb8cbf1489bce86dbb074c5eb5dabc96ea830c213eb6c733" exitCode=0 Feb 17 09:29:14 crc kubenswrapper[4848]: I0217 09:29:14.810675 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" event={"ID":"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e","Type":"ContainerDied","Data":"9171ce8935eae8a5fb8cbf1489bce86dbb074c5eb5dabc96ea830c213eb6c733"} Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.239626 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.366188 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l48tw\" (UniqueName: \"kubernetes.io/projected/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-kube-api-access-l48tw\") pod \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.366366 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-ssh-key-openstack-edpm-ipam\") pod \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.366632 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-inventory\") pod \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.366893 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-bootstrap-combined-ca-bundle\") pod \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\" (UID: \"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e\") " Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.373110 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "9eaf40fa-e0f2-445b-a17b-98f88fc76a5e" (UID: "9eaf40fa-e0f2-445b-a17b-98f88fc76a5e"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.374128 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-kube-api-access-l48tw" (OuterVolumeSpecName: "kube-api-access-l48tw") pod "9eaf40fa-e0f2-445b-a17b-98f88fc76a5e" (UID: "9eaf40fa-e0f2-445b-a17b-98f88fc76a5e"). InnerVolumeSpecName "kube-api-access-l48tw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.403240 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-inventory" (OuterVolumeSpecName: "inventory") pod "9eaf40fa-e0f2-445b-a17b-98f88fc76a5e" (UID: "9eaf40fa-e0f2-445b-a17b-98f88fc76a5e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.416618 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9eaf40fa-e0f2-445b-a17b-98f88fc76a5e" (UID: "9eaf40fa-e0f2-445b-a17b-98f88fc76a5e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.471821 4848 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.471873 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l48tw\" (UniqueName: \"kubernetes.io/projected/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-kube-api-access-l48tw\") on node \"crc\" DevicePath \"\"" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.471897 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.471924 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9eaf40fa-e0f2-445b-a17b-98f88fc76a5e-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.837728 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" event={"ID":"9eaf40fa-e0f2-445b-a17b-98f88fc76a5e","Type":"ContainerDied","Data":"2ac71df747b20b5f1c8cad62ff0524345c3ca5819f8310224f923774b1b6e984"} Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.838265 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ac71df747b20b5f1c8cad62ff0524345c3ca5819f8310224f923774b1b6e984" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.837902 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.954399 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh"] Feb 17 09:29:16 crc kubenswrapper[4848]: E0217 09:29:16.955060 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9eaf40fa-e0f2-445b-a17b-98f88fc76a5e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.955090 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9eaf40fa-e0f2-445b-a17b-98f88fc76a5e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.955449 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9eaf40fa-e0f2-445b-a17b-98f88fc76a5e" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.962877 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.966278 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.966922 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.966955 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.969966 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:29:16 crc kubenswrapper[4848]: I0217 09:29:16.974136 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh"] Feb 17 09:29:17 crc kubenswrapper[4848]: I0217 09:29:17.083190 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nztrh\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:17 crc kubenswrapper[4848]: I0217 09:29:17.083255 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mwmh\" (UniqueName: \"kubernetes.io/projected/502cc85d-fb83-4c34-825f-5aca6c880af7-kube-api-access-6mwmh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nztrh\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:17 crc 
kubenswrapper[4848]: I0217 09:29:17.083425 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nztrh\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:17 crc kubenswrapper[4848]: I0217 09:29:17.185468 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nztrh\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:17 crc kubenswrapper[4848]: I0217 09:29:17.185570 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mwmh\" (UniqueName: \"kubernetes.io/projected/502cc85d-fb83-4c34-825f-5aca6c880af7-kube-api-access-6mwmh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nztrh\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:17 crc kubenswrapper[4848]: I0217 09:29:17.185721 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nztrh\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:17 crc kubenswrapper[4848]: I0217 09:29:17.207334 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nztrh\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:17 crc kubenswrapper[4848]: I0217 09:29:17.207334 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nztrh\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:17 crc kubenswrapper[4848]: I0217 09:29:17.212544 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mwmh\" (UniqueName: \"kubernetes.io/projected/502cc85d-fb83-4c34-825f-5aca6c880af7-kube-api-access-6mwmh\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-nztrh\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:17 crc kubenswrapper[4848]: I0217 09:29:17.302580 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:29:17 crc kubenswrapper[4848]: I0217 09:29:17.891111 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh"] Feb 17 09:29:17 crc kubenswrapper[4848]: W0217 09:29:17.900711 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod502cc85d_fb83_4c34_825f_5aca6c880af7.slice/crio-70096ac1879516f49f3e95dffaa5e264bee090a37decf6ca73da297bf2d3d919 WatchSource:0}: Error finding container 70096ac1879516f49f3e95dffaa5e264bee090a37decf6ca73da297bf2d3d919: Status 404 returned error can't find the container with id 70096ac1879516f49f3e95dffaa5e264bee090a37decf6ca73da297bf2d3d919 Feb 17 09:29:18 crc kubenswrapper[4848]: I0217 09:29:18.771692 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:29:18 crc kubenswrapper[4848]: I0217 09:29:18.772306 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:29:18 crc kubenswrapper[4848]: I0217 09:29:18.861367 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" event={"ID":"502cc85d-fb83-4c34-825f-5aca6c880af7","Type":"ContainerStarted","Data":"4f475882137f4f3d05ac713efdca46a33b0e1b710bff6150e9d6e732655b8532"} Feb 17 09:29:18 crc kubenswrapper[4848]: I0217 09:29:18.861414 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" event={"ID":"502cc85d-fb83-4c34-825f-5aca6c880af7","Type":"ContainerStarted","Data":"70096ac1879516f49f3e95dffaa5e264bee090a37decf6ca73da297bf2d3d919"} Feb 17 09:29:18 crc kubenswrapper[4848]: I0217 09:29:18.881926 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" podStartSLOduration=2.440932939 podStartE2EDuration="2.881899847s" podCreationTimestamp="2026-02-17 09:29:16 +0000 UTC" firstStartedPulling="2026-02-17 09:29:17.903242064 +0000 UTC m=+1435.446497710" lastFinishedPulling="2026-02-17 09:29:18.344208972 +0000 UTC m=+1435.887464618" observedRunningTime="2026-02-17 09:29:18.880131816 +0000 UTC m=+1436.423387532" watchObservedRunningTime="2026-02-17 09:29:18.881899847 +0000 UTC m=+1436.425155533" Feb 17 09:29:48 crc kubenswrapper[4848]: I0217 09:29:48.771423 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:29:48 crc kubenswrapper[4848]: I0217 09:29:48.772052 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:29:48 crc kubenswrapper[4848]: I0217 09:29:48.772113 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:29:48 crc kubenswrapper[4848]: I0217 09:29:48.773054 4848 kuberuntime_manager.go:1027] "Message for 
Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7308c58dc682de7c0d5cf3657a75860bc4b9b36b777a9bde17b6a927094a6302"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:29:48 crc kubenswrapper[4848]: I0217 09:29:48.773155 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://7308c58dc682de7c0d5cf3657a75860bc4b9b36b777a9bde17b6a927094a6302" gracePeriod=600 Feb 17 09:29:49 crc kubenswrapper[4848]: I0217 09:29:49.184845 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="7308c58dc682de7c0d5cf3657a75860bc4b9b36b777a9bde17b6a927094a6302" exitCode=0 Feb 17 09:29:49 crc kubenswrapper[4848]: I0217 09:29:49.184944 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"7308c58dc682de7c0d5cf3657a75860bc4b9b36b777a9bde17b6a927094a6302"} Feb 17 09:29:49 crc kubenswrapper[4848]: I0217 09:29:49.185223 4848 scope.go:117] "RemoveContainer" containerID="42389abaf9cd91ca5ab94c566a487ee2516f882d6f353baa7369223b1e0966e6" Feb 17 09:29:50 crc kubenswrapper[4848]: I0217 09:29:50.201983 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521"} Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.155931 4848 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z"] Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.159445 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.165862 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.166130 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.173214 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z"] Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.186633 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c2p4\" (UniqueName: \"kubernetes.io/projected/e13e335f-75a3-43a6-9bd8-7ea62595fa13-kube-api-access-6c2p4\") pod \"collect-profiles-29522010-fmh2z\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.186862 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e13e335f-75a3-43a6-9bd8-7ea62595fa13-secret-volume\") pod \"collect-profiles-29522010-fmh2z\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.186953 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/e13e335f-75a3-43a6-9bd8-7ea62595fa13-config-volume\") pod \"collect-profiles-29522010-fmh2z\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.288691 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e13e335f-75a3-43a6-9bd8-7ea62595fa13-secret-volume\") pod \"collect-profiles-29522010-fmh2z\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.288805 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e13e335f-75a3-43a6-9bd8-7ea62595fa13-config-volume\") pod \"collect-profiles-29522010-fmh2z\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.288951 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c2p4\" (UniqueName: \"kubernetes.io/projected/e13e335f-75a3-43a6-9bd8-7ea62595fa13-kube-api-access-6c2p4\") pod \"collect-profiles-29522010-fmh2z\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.289951 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e13e335f-75a3-43a6-9bd8-7ea62595fa13-config-volume\") pod \"collect-profiles-29522010-fmh2z\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.303535 4848 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e13e335f-75a3-43a6-9bd8-7ea62595fa13-secret-volume\") pod \"collect-profiles-29522010-fmh2z\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.307831 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c2p4\" (UniqueName: \"kubernetes.io/projected/e13e335f-75a3-43a6-9bd8-7ea62595fa13-kube-api-access-6c2p4\") pod \"collect-profiles-29522010-fmh2z\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.488797 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:00 crc kubenswrapper[4848]: I0217 09:30:00.765047 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z"] Feb 17 09:30:00 crc kubenswrapper[4848]: W0217 09:30:00.775175 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode13e335f_75a3_43a6_9bd8_7ea62595fa13.slice/crio-b5c5fa586132511644b9215976a5dba5fd6df6e0f30a1d5bf20c2c577d36940e WatchSource:0}: Error finding container b5c5fa586132511644b9215976a5dba5fd6df6e0f30a1d5bf20c2c577d36940e: Status 404 returned error can't find the container with id b5c5fa586132511644b9215976a5dba5fd6df6e0f30a1d5bf20c2c577d36940e Feb 17 09:30:01 crc kubenswrapper[4848]: I0217 09:30:01.320945 4848 generic.go:334] "Generic (PLEG): container finished" podID="e13e335f-75a3-43a6-9bd8-7ea62595fa13" containerID="a32e391ea4a9c90128d8d6eb61d47c403d3a552bd511d44c50963883ec28ba50" exitCode=0 Feb 17 09:30:01 crc 
kubenswrapper[4848]: I0217 09:30:01.321090 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" event={"ID":"e13e335f-75a3-43a6-9bd8-7ea62595fa13","Type":"ContainerDied","Data":"a32e391ea4a9c90128d8d6eb61d47c403d3a552bd511d44c50963883ec28ba50"} Feb 17 09:30:01 crc kubenswrapper[4848]: I0217 09:30:01.321414 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" event={"ID":"e13e335f-75a3-43a6-9bd8-7ea62595fa13","Type":"ContainerStarted","Data":"b5c5fa586132511644b9215976a5dba5fd6df6e0f30a1d5bf20c2c577d36940e"} Feb 17 09:30:02 crc kubenswrapper[4848]: I0217 09:30:02.740971 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:02 crc kubenswrapper[4848]: I0217 09:30:02.836707 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e13e335f-75a3-43a6-9bd8-7ea62595fa13-secret-volume\") pod \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " Feb 17 09:30:02 crc kubenswrapper[4848]: I0217 09:30:02.836809 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e13e335f-75a3-43a6-9bd8-7ea62595fa13-config-volume\") pod \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " Feb 17 09:30:02 crc kubenswrapper[4848]: I0217 09:30:02.836884 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c2p4\" (UniqueName: \"kubernetes.io/projected/e13e335f-75a3-43a6-9bd8-7ea62595fa13-kube-api-access-6c2p4\") pod \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\" (UID: \"e13e335f-75a3-43a6-9bd8-7ea62595fa13\") " Feb 17 09:30:02 crc kubenswrapper[4848]: 
I0217 09:30:02.838350 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e13e335f-75a3-43a6-9bd8-7ea62595fa13-config-volume" (OuterVolumeSpecName: "config-volume") pod "e13e335f-75a3-43a6-9bd8-7ea62595fa13" (UID: "e13e335f-75a3-43a6-9bd8-7ea62595fa13"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:30:02 crc kubenswrapper[4848]: I0217 09:30:02.841959 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e13e335f-75a3-43a6-9bd8-7ea62595fa13-kube-api-access-6c2p4" (OuterVolumeSpecName: "kube-api-access-6c2p4") pod "e13e335f-75a3-43a6-9bd8-7ea62595fa13" (UID: "e13e335f-75a3-43a6-9bd8-7ea62595fa13"). InnerVolumeSpecName "kube-api-access-6c2p4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:30:02 crc kubenswrapper[4848]: I0217 09:30:02.841983 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e13e335f-75a3-43a6-9bd8-7ea62595fa13-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e13e335f-75a3-43a6-9bd8-7ea62595fa13" (UID: "e13e335f-75a3-43a6-9bd8-7ea62595fa13"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:30:02 crc kubenswrapper[4848]: I0217 09:30:02.939045 4848 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e13e335f-75a3-43a6-9bd8-7ea62595fa13-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 09:30:02 crc kubenswrapper[4848]: I0217 09:30:02.939084 4848 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e13e335f-75a3-43a6-9bd8-7ea62595fa13-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 09:30:02 crc kubenswrapper[4848]: I0217 09:30:02.939095 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c2p4\" (UniqueName: \"kubernetes.io/projected/e13e335f-75a3-43a6-9bd8-7ea62595fa13-kube-api-access-6c2p4\") on node \"crc\" DevicePath \"\"" Feb 17 09:30:03 crc kubenswrapper[4848]: I0217 09:30:03.345843 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" event={"ID":"e13e335f-75a3-43a6-9bd8-7ea62595fa13","Type":"ContainerDied","Data":"b5c5fa586132511644b9215976a5dba5fd6df6e0f30a1d5bf20c2c577d36940e"} Feb 17 09:30:03 crc kubenswrapper[4848]: I0217 09:30:03.346189 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5c5fa586132511644b9215976a5dba5fd6df6e0f30a1d5bf20c2c577d36940e" Feb 17 09:30:03 crc kubenswrapper[4848]: I0217 09:30:03.345909 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z" Feb 17 09:30:26 crc kubenswrapper[4848]: I0217 09:30:26.187841 4848 scope.go:117] "RemoveContainer" containerID="858a90c2d08c5073a90808dff41266276dc577d8e48dbf8b9888ce027e11462d" Feb 17 09:30:26 crc kubenswrapper[4848]: I0217 09:30:26.227448 4848 scope.go:117] "RemoveContainer" containerID="602cd7a4b20ac413d3192147119f005fe6c9216e11594610180690750bd6da1b" Feb 17 09:30:32 crc kubenswrapper[4848]: I0217 09:30:32.045284 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6ee8-account-create-update-w4zgx"] Feb 17 09:30:32 crc kubenswrapper[4848]: I0217 09:30:32.060939 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-p7czj"] Feb 17 09:30:32 crc kubenswrapper[4848]: I0217 09:30:32.072162 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-p7czj"] Feb 17 09:30:32 crc kubenswrapper[4848]: I0217 09:30:32.082714 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6ee8-account-create-update-w4zgx"] Feb 17 09:30:33 crc kubenswrapper[4848]: I0217 09:30:33.401746 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a0b47d8-84ea-4b51-a171-cb713122e873" path="/var/lib/kubelet/pods/2a0b47d8-84ea-4b51-a171-cb713122e873/volumes" Feb 17 09:30:33 crc kubenswrapper[4848]: I0217 09:30:33.402316 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbe13273-ef85-4246-99bf-cb85a278c25d" path="/var/lib/kubelet/pods/bbe13273-ef85-4246-99bf-cb85a278c25d/volumes" Feb 17 09:30:38 crc kubenswrapper[4848]: I0217 09:30:38.731314 4848 generic.go:334] "Generic (PLEG): container finished" podID="502cc85d-fb83-4c34-825f-5aca6c880af7" containerID="4f475882137f4f3d05ac713efdca46a33b0e1b710bff6150e9d6e732655b8532" exitCode=0 Feb 17 09:30:38 crc kubenswrapper[4848]: I0217 09:30:38.731390 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" event={"ID":"502cc85d-fb83-4c34-825f-5aca6c880af7","Type":"ContainerDied","Data":"4f475882137f4f3d05ac713efdca46a33b0e1b710bff6150e9d6e732655b8532"} Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.034577 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-207e-account-create-update-ljmx9"] Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.043050 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-e78a-account-create-update-rbzcs"] Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.050539 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-w29sg"] Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.058306 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-207e-account-create-update-ljmx9"] Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.069718 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-e78a-account-create-update-rbzcs"] Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.078055 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-wp62q"] Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.086048 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-w29sg"] Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.093728 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wp62q"] Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.394088 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1009b7c4-b07d-4be4-87a2-d82f1286dfc9" path="/var/lib/kubelet/pods/1009b7c4-b07d-4be4-87a2-d82f1286dfc9/volumes" Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.394643 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="858540f4-3108-4527-bf4f-d163b4f2c66f" 
path="/var/lib/kubelet/pods/858540f4-3108-4527-bf4f-d163b4f2c66f/volumes" Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.395198 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e60dd67a-6939-45a6-97f1-4f2e54bc4ca7" path="/var/lib/kubelet/pods/e60dd67a-6939-45a6-97f1-4f2e54bc4ca7/volumes" Feb 17 09:30:39 crc kubenswrapper[4848]: I0217 09:30:39.395745 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6eaa43b-c60c-415b-9efe-87f15bea768e" path="/var/lib/kubelet/pods/e6eaa43b-c60c-415b-9efe-87f15bea768e/volumes" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.155024 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.210217 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-ssh-key-openstack-edpm-ipam\") pod \"502cc85d-fb83-4c34-825f-5aca6c880af7\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.210300 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mwmh\" (UniqueName: \"kubernetes.io/projected/502cc85d-fb83-4c34-825f-5aca6c880af7-kube-api-access-6mwmh\") pod \"502cc85d-fb83-4c34-825f-5aca6c880af7\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.210350 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-inventory\") pod \"502cc85d-fb83-4c34-825f-5aca6c880af7\" (UID: \"502cc85d-fb83-4c34-825f-5aca6c880af7\") " Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.216540 4848 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/502cc85d-fb83-4c34-825f-5aca6c880af7-kube-api-access-6mwmh" (OuterVolumeSpecName: "kube-api-access-6mwmh") pod "502cc85d-fb83-4c34-825f-5aca6c880af7" (UID: "502cc85d-fb83-4c34-825f-5aca6c880af7"). InnerVolumeSpecName "kube-api-access-6mwmh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.265330 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "502cc85d-fb83-4c34-825f-5aca6c880af7" (UID: "502cc85d-fb83-4c34-825f-5aca6c880af7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.267001 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-inventory" (OuterVolumeSpecName: "inventory") pod "502cc85d-fb83-4c34-825f-5aca6c880af7" (UID: "502cc85d-fb83-4c34-825f-5aca6c880af7"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.312407 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.312443 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mwmh\" (UniqueName: \"kubernetes.io/projected/502cc85d-fb83-4c34-825f-5aca6c880af7-kube-api-access-6mwmh\") on node \"crc\" DevicePath \"\"" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.312454 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/502cc85d-fb83-4c34-825f-5aca6c880af7-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.759255 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" event={"ID":"502cc85d-fb83-4c34-825f-5aca6c880af7","Type":"ContainerDied","Data":"70096ac1879516f49f3e95dffaa5e264bee090a37decf6ca73da297bf2d3d919"} Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.759302 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70096ac1879516f49f3e95dffaa5e264bee090a37decf6ca73da297bf2d3d919" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.759365 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-nztrh" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.843750 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv"] Feb 17 09:30:40 crc kubenswrapper[4848]: E0217 09:30:40.844208 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e13e335f-75a3-43a6-9bd8-7ea62595fa13" containerName="collect-profiles" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.844228 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e13e335f-75a3-43a6-9bd8-7ea62595fa13" containerName="collect-profiles" Feb 17 09:30:40 crc kubenswrapper[4848]: E0217 09:30:40.844254 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="502cc85d-fb83-4c34-825f-5aca6c880af7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.844264 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="502cc85d-fb83-4c34-825f-5aca6c880af7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.844476 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="502cc85d-fb83-4c34-825f-5aca6c880af7" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.844519 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e13e335f-75a3-43a6-9bd8-7ea62595fa13" containerName="collect-profiles" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.846142 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.850416 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.851570 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.852221 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.852310 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.863601 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv"] Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.924142 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:40 crc kubenswrapper[4848]: I0217 09:30:40.924382 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:40 crc 
kubenswrapper[4848]: I0217 09:30:40.924523 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9j9x\" (UniqueName: \"kubernetes.io/projected/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-kube-api-access-l9j9x\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:41 crc kubenswrapper[4848]: I0217 09:30:41.026124 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:41 crc kubenswrapper[4848]: I0217 09:30:41.026187 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:41 crc kubenswrapper[4848]: I0217 09:30:41.026218 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9j9x\" (UniqueName: \"kubernetes.io/projected/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-kube-api-access-l9j9x\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:41 crc kubenswrapper[4848]: I0217 09:30:41.031370 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:41 crc kubenswrapper[4848]: I0217 09:30:41.031639 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:41 crc kubenswrapper[4848]: I0217 09:30:41.041247 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9j9x\" (UniqueName: \"kubernetes.io/projected/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-kube-api-access-l9j9x\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:41 crc kubenswrapper[4848]: I0217 09:30:41.206549 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:30:41 crc kubenswrapper[4848]: I0217 09:30:41.764518 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv"] Feb 17 09:30:42 crc kubenswrapper[4848]: I0217 09:30:42.793340 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" event={"ID":"9c1fceab-33b4-4eee-8e26-c9bc2a35f018","Type":"ContainerStarted","Data":"4d813db4fd2060bb1e9f9c565c9a4ca40ef9d7dde21d80f2c020a6c1c1f25154"} Feb 17 09:30:42 crc kubenswrapper[4848]: I0217 09:30:42.793750 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" event={"ID":"9c1fceab-33b4-4eee-8e26-c9bc2a35f018","Type":"ContainerStarted","Data":"7cf4b2b87153cf9fb6f58adf2e80a10b15bba91d4ee087d98fa149102460aee0"} Feb 17 09:30:42 crc kubenswrapper[4848]: I0217 09:30:42.818293 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" podStartSLOduration=2.17679261 podStartE2EDuration="2.818265652s" podCreationTimestamp="2026-02-17 09:30:40 +0000 UTC" firstStartedPulling="2026-02-17 09:30:41.791515794 +0000 UTC m=+1519.334771440" lastFinishedPulling="2026-02-17 09:30:42.432988836 +0000 UTC m=+1519.976244482" observedRunningTime="2026-02-17 09:30:42.808925845 +0000 UTC m=+1520.352181511" watchObservedRunningTime="2026-02-17 09:30:42.818265652 +0000 UTC m=+1520.361521298" Feb 17 09:30:52 crc kubenswrapper[4848]: I0217 09:30:52.036973 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-884bt"] Feb 17 09:30:52 crc kubenswrapper[4848]: I0217 09:30:52.045395 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-884bt"] Feb 17 09:30:53 crc 
kubenswrapper[4848]: I0217 09:30:53.396938 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4958099b-fe10-4fd4-abaa-00d1520eda93" path="/var/lib/kubelet/pods/4958099b-fe10-4fd4-abaa-00d1520eda93/volumes" Feb 17 09:31:00 crc kubenswrapper[4848]: I0217 09:31:00.080174 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-5f5pj"] Feb 17 09:31:00 crc kubenswrapper[4848]: I0217 09:31:00.088951 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-5f5pj"] Feb 17 09:31:01 crc kubenswrapper[4848]: I0217 09:31:01.403273 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed29dc41-db30-4792-8518-4ef61f232734" path="/var/lib/kubelet/pods/ed29dc41-db30-4792-8518-4ef61f232734/volumes" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.428571 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-c6t9p"] Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.431348 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.439279 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c6t9p"] Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.468842 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-utilities\") pod \"community-operators-c6t9p\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.468896 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8q6v\" (UniqueName: \"kubernetes.io/projected/8ce6541f-ed6a-4515-bcc6-07203c9e7067-kube-api-access-b8q6v\") pod \"community-operators-c6t9p\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.469237 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-catalog-content\") pod \"community-operators-c6t9p\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.571241 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-catalog-content\") pod \"community-operators-c6t9p\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.571334 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-utilities\") pod \"community-operators-c6t9p\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.571382 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8q6v\" (UniqueName: \"kubernetes.io/projected/8ce6541f-ed6a-4515-bcc6-07203c9e7067-kube-api-access-b8q6v\") pod \"community-operators-c6t9p\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.572006 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-catalog-content\") pod \"community-operators-c6t9p\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.572006 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-utilities\") pod \"community-operators-c6t9p\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.610821 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8q6v\" (UniqueName: \"kubernetes.io/projected/8ce6541f-ed6a-4515-bcc6-07203c9e7067-kube-api-access-b8q6v\") pod \"community-operators-c6t9p\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:02 crc kubenswrapper[4848]: I0217 09:31:02.765069 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:03 crc kubenswrapper[4848]: I0217 09:31:03.291817 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-c6t9p"] Feb 17 09:31:04 crc kubenswrapper[4848]: I0217 09:31:04.029138 4848 generic.go:334] "Generic (PLEG): container finished" podID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerID="88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939" exitCode=0 Feb 17 09:31:04 crc kubenswrapper[4848]: I0217 09:31:04.029202 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t9p" event={"ID":"8ce6541f-ed6a-4515-bcc6-07203c9e7067","Type":"ContainerDied","Data":"88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939"} Feb 17 09:31:04 crc kubenswrapper[4848]: I0217 09:31:04.030061 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t9p" event={"ID":"8ce6541f-ed6a-4515-bcc6-07203c9e7067","Type":"ContainerStarted","Data":"46a6c9f61098cbd56cfe52aa1339a6a982ff0976e42a1a6653eb4f9a4221f2d8"} Feb 17 09:31:04 crc kubenswrapper[4848]: I0217 09:31:04.031254 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 09:31:07 crc kubenswrapper[4848]: I0217 09:31:07.066904 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t9p" event={"ID":"8ce6541f-ed6a-4515-bcc6-07203c9e7067","Type":"ContainerStarted","Data":"4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0"} Feb 17 09:31:08 crc kubenswrapper[4848]: I0217 09:31:08.080673 4848 generic.go:334] "Generic (PLEG): container finished" podID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerID="4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0" exitCode=0 Feb 17 09:31:08 crc kubenswrapper[4848]: I0217 09:31:08.080823 4848 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-c6t9p" event={"ID":"8ce6541f-ed6a-4515-bcc6-07203c9e7067","Type":"ContainerDied","Data":"4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0"} Feb 17 09:31:09 crc kubenswrapper[4848]: I0217 09:31:09.094433 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t9p" event={"ID":"8ce6541f-ed6a-4515-bcc6-07203c9e7067","Type":"ContainerStarted","Data":"315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c"} Feb 17 09:31:09 crc kubenswrapper[4848]: I0217 09:31:09.133032 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-c6t9p" podStartSLOduration=2.676325507 podStartE2EDuration="7.133009276s" podCreationTimestamp="2026-02-17 09:31:02 +0000 UTC" firstStartedPulling="2026-02-17 09:31:04.030951604 +0000 UTC m=+1541.574207250" lastFinishedPulling="2026-02-17 09:31:08.487635373 +0000 UTC m=+1546.030891019" observedRunningTime="2026-02-17 09:31:09.119351236 +0000 UTC m=+1546.662606932" watchObservedRunningTime="2026-02-17 09:31:09.133009276 +0000 UTC m=+1546.676264932" Feb 17 09:31:12 crc kubenswrapper[4848]: I0217 09:31:12.765849 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:12 crc kubenswrapper[4848]: I0217 09:31:12.766436 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:12 crc kubenswrapper[4848]: I0217 09:31:12.811662 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:13 crc kubenswrapper[4848]: I0217 09:31:13.200364 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:13 crc kubenswrapper[4848]: I0217 
09:31:13.268445 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c6t9p"] Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.152142 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-c6t9p" podUID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerName="registry-server" containerID="cri-o://315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c" gracePeriod=2 Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.651620 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.755179 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8q6v\" (UniqueName: \"kubernetes.io/projected/8ce6541f-ed6a-4515-bcc6-07203c9e7067-kube-api-access-b8q6v\") pod \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.755238 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-catalog-content\") pod \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.755351 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-utilities\") pod \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\" (UID: \"8ce6541f-ed6a-4515-bcc6-07203c9e7067\") " Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.756529 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-utilities" (OuterVolumeSpecName: 
"utilities") pod "8ce6541f-ed6a-4515-bcc6-07203c9e7067" (UID: "8ce6541f-ed6a-4515-bcc6-07203c9e7067"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.762930 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ce6541f-ed6a-4515-bcc6-07203c9e7067-kube-api-access-b8q6v" (OuterVolumeSpecName: "kube-api-access-b8q6v") pod "8ce6541f-ed6a-4515-bcc6-07203c9e7067" (UID: "8ce6541f-ed6a-4515-bcc6-07203c9e7067"). InnerVolumeSpecName "kube-api-access-b8q6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.819506 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ce6541f-ed6a-4515-bcc6-07203c9e7067" (UID: "8ce6541f-ed6a-4515-bcc6-07203c9e7067"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.858005 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.858060 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8q6v\" (UniqueName: \"kubernetes.io/projected/8ce6541f-ed6a-4515-bcc6-07203c9e7067-kube-api-access-b8q6v\") on node \"crc\" DevicePath \"\"" Feb 17 09:31:15 crc kubenswrapper[4848]: I0217 09:31:15.858080 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ce6541f-ed6a-4515-bcc6-07203c9e7067-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.167931 4848 generic.go:334] "Generic (PLEG): container finished" podID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerID="315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c" exitCode=0 Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.167973 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t9p" event={"ID":"8ce6541f-ed6a-4515-bcc6-07203c9e7067","Type":"ContainerDied","Data":"315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c"} Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.168885 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-c6t9p" event={"ID":"8ce6541f-ed6a-4515-bcc6-07203c9e7067","Type":"ContainerDied","Data":"46a6c9f61098cbd56cfe52aa1339a6a982ff0976e42a1a6653eb4f9a4221f2d8"} Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.168912 4848 scope.go:117] "RemoveContainer" containerID="315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c" Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 
09:31:16.168015 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-c6t9p" Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.212637 4848 scope.go:117] "RemoveContainer" containerID="4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0" Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.213659 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-c6t9p"] Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.238398 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-c6t9p"] Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.253803 4848 scope.go:117] "RemoveContainer" containerID="88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939" Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.282675 4848 scope.go:117] "RemoveContainer" containerID="315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c" Feb 17 09:31:16 crc kubenswrapper[4848]: E0217 09:31:16.283133 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c\": container with ID starting with 315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c not found: ID does not exist" containerID="315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c" Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.283183 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c"} err="failed to get container status \"315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c\": rpc error: code = NotFound desc = could not find container \"315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c\": container with ID starting with 
315dc8abf1c02b1b0fa99952b8f203eb5c2cdcc2e6023cdc6d3b57ea05236f9c not found: ID does not exist" Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.283210 4848 scope.go:117] "RemoveContainer" containerID="4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0" Feb 17 09:31:16 crc kubenswrapper[4848]: E0217 09:31:16.283541 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0\": container with ID starting with 4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0 not found: ID does not exist" containerID="4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0" Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.283593 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0"} err="failed to get container status \"4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0\": rpc error: code = NotFound desc = could not find container \"4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0\": container with ID starting with 4623dbc77597943f54ba5f3118af6fb6438e1b1b46d45a81ba873be1174cbda0 not found: ID does not exist" Feb 17 09:31:16 crc kubenswrapper[4848]: I0217 09:31:16.283633 4848 scope.go:117] "RemoveContainer" containerID="88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939" Feb 17 09:31:16 crc kubenswrapper[4848]: E0217 09:31:16.283937 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939\": container with ID starting with 88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939 not found: ID does not exist" containerID="88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939" Feb 17 09:31:16 crc 
kubenswrapper[4848]: I0217 09:31:16.283969 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939"} err="failed to get container status \"88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939\": rpc error: code = NotFound desc = could not find container \"88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939\": container with ID starting with 88ec21e2072c497595c955f8e6929afdf7d0124af4c25b9899588ba3358ed939 not found: ID does not exist" Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.053603 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-030c-account-create-update-zhd9p"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.070361 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-ffrb6"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.080275 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4s7cd"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.092022 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-ffrb6"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.100943 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4s7cd"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.107968 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-s9chz"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.130856 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-c50b-account-create-update-lvn7w"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.142225 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-605d-account-create-update-lscpt"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.149243 4848 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/cinder-030c-account-create-update-zhd9p"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.156935 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-s9chz"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.165034 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-605d-account-create-update-lscpt"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.173893 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-c50b-account-create-update-lvn7w"] Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.403389 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04dc10b3-8edf-4385-bdfa-24b322f8355e" path="/var/lib/kubelet/pods/04dc10b3-8edf-4385-bdfa-24b322f8355e/volumes" Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.405025 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="272b2170-d012-47c9-9d08-1a696ef88165" path="/var/lib/kubelet/pods/272b2170-d012-47c9-9d08-1a696ef88165/volumes" Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.406574 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a4b82e2-58b0-4b38-b7db-6882298598c4" path="/var/lib/kubelet/pods/4a4b82e2-58b0-4b38-b7db-6882298598c4/volumes" Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.411937 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82b8b5f5-7a6c-4a94-9255-510bd8ac99a3" path="/var/lib/kubelet/pods/82b8b5f5-7a6c-4a94-9255-510bd8ac99a3/volumes" Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.412681 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" path="/var/lib/kubelet/pods/8ce6541f-ed6a-4515-bcc6-07203c9e7067/volumes" Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.413569 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c6bc83da-3773-4c42-95b7-56c77fd0fdb1" path="/var/lib/kubelet/pods/c6bc83da-3773-4c42-95b7-56c77fd0fdb1/volumes" Feb 17 09:31:17 crc kubenswrapper[4848]: I0217 09:31:17.414740 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00" path="/var/lib/kubelet/pods/e89d62bd-4a60-4bbc-a6d2-50cfb4d05a00/volumes" Feb 17 09:31:21 crc kubenswrapper[4848]: I0217 09:31:21.029864 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jjhrd"] Feb 17 09:31:21 crc kubenswrapper[4848]: I0217 09:31:21.037039 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jjhrd"] Feb 17 09:31:21 crc kubenswrapper[4848]: I0217 09:31:21.400680 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab60454-0853-4ec5-ba88-78e220fab168" path="/var/lib/kubelet/pods/eab60454-0853-4ec5-ba88-78e220fab168/volumes" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.360405 4848 scope.go:117] "RemoveContainer" containerID="9f5736ebd81d8cf0a1c76a7ab5c957539f3ea45df0e0d5d2fafd0c53c488ece3" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.406517 4848 scope.go:117] "RemoveContainer" containerID="7133481f5a6206c7e497ea5e74f8de8c2602fbe71a5348186b90335e51516f40" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.473586 4848 scope.go:117] "RemoveContainer" containerID="befb432b3470e7a1531ee9e2e1091f5d81c97049f871b923862d0c1fe43807b3" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.526235 4848 scope.go:117] "RemoveContainer" containerID="946980ceb65edcae562802ae21a2cb294f5a034371daa6644585de205d1fc1b3" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.575473 4848 scope.go:117] "RemoveContainer" containerID="c96bc5281c3998f7bbcdded7e70ad475f607c37110ccef7975e872b7b223f6b9" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.600496 4848 scope.go:117] "RemoveContainer" 
containerID="252f99d92bd4bf0bb1e005048fd69d1f82f58219b2756de4285dcada4bdb9adc" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.658317 4848 scope.go:117] "RemoveContainer" containerID="a692e8999ccbddeb59800c874722b8e348dcb571dd3d19c772f852c0ea2c8425" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.680741 4848 scope.go:117] "RemoveContainer" containerID="7c761436a615b2cc970e711cc84889ba2c0ac2004d65c43913dee28aac4499d8" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.704846 4848 scope.go:117] "RemoveContainer" containerID="ce96dc17b10dd0adbf3d1534437111e4b5801c23ffe9f7d1bf9a7d9b75233630" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.735671 4848 scope.go:117] "RemoveContainer" containerID="1904915061b9f32e5e6e94a3f18a6812fbce876f6d896d2ccf4f2ad3f2b986f8" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.793194 4848 scope.go:117] "RemoveContainer" containerID="1ebd6ee74f3c2dd906a68a542dd080cd6e29e87511b6ed94d0faa2987b16f9e0" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.831143 4848 scope.go:117] "RemoveContainer" containerID="d212ef58fc2c276356a1df2d260d6e0cc739edb4585cfe26c2d093aa28ca4d07" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.865947 4848 scope.go:117] "RemoveContainer" containerID="d9a98dc4add23ca4f47299741110c817f5de22f5be6b3864588e30ccaace9359" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.909335 4848 scope.go:117] "RemoveContainer" containerID="c49b0a8baefdf1c945a466937de35cec98bb40ad145930dd6af1aa64c04531bb" Feb 17 09:31:26 crc kubenswrapper[4848]: I0217 09:31:26.944854 4848 scope.go:117] "RemoveContainer" containerID="86001ceea136307b0385e2ba29bfcab2f52608c6d2e715365152f8e5472e7f36" Feb 17 09:31:37 crc kubenswrapper[4848]: I0217 09:31:37.742322 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cdr5v"] Feb 17 09:31:37 crc kubenswrapper[4848]: E0217 09:31:37.743229 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerName="extract-utilities" Feb 17 09:31:37 crc kubenswrapper[4848]: I0217 09:31:37.743247 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerName="extract-utilities" Feb 17 09:31:37 crc kubenswrapper[4848]: E0217 09:31:37.743282 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerName="registry-server" Feb 17 09:31:37 crc kubenswrapper[4848]: I0217 09:31:37.743293 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerName="registry-server" Feb 17 09:31:37 crc kubenswrapper[4848]: E0217 09:31:37.743307 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerName="extract-content" Feb 17 09:31:37 crc kubenswrapper[4848]: I0217 09:31:37.743315 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerName="extract-content" Feb 17 09:31:37 crc kubenswrapper[4848]: I0217 09:31:37.743543 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ce6541f-ed6a-4515-bcc6-07203c9e7067" containerName="registry-server" Feb 17 09:31:37 crc kubenswrapper[4848]: I0217 09:31:37.745142 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:37 crc kubenswrapper[4848]: I0217 09:31:37.754475 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdr5v"] Feb 17 09:31:37 crc kubenswrapper[4848]: I0217 09:31:37.934048 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2pvd\" (UniqueName: \"kubernetes.io/projected/31a42093-0072-44a7-9357-58c67e413a4c-kube-api-access-t2pvd\") pod \"redhat-marketplace-cdr5v\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:37 crc kubenswrapper[4848]: I0217 09:31:37.934128 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-utilities\") pod \"redhat-marketplace-cdr5v\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:37 crc kubenswrapper[4848]: I0217 09:31:37.934187 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-catalog-content\") pod \"redhat-marketplace-cdr5v\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:38 crc kubenswrapper[4848]: I0217 09:31:38.036436 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2pvd\" (UniqueName: \"kubernetes.io/projected/31a42093-0072-44a7-9357-58c67e413a4c-kube-api-access-t2pvd\") pod \"redhat-marketplace-cdr5v\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:38 crc kubenswrapper[4848]: I0217 09:31:38.036532 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-utilities\") pod \"redhat-marketplace-cdr5v\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:38 crc kubenswrapper[4848]: I0217 09:31:38.036565 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-catalog-content\") pod \"redhat-marketplace-cdr5v\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:38 crc kubenswrapper[4848]: I0217 09:31:38.037160 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-catalog-content\") pod \"redhat-marketplace-cdr5v\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:38 crc kubenswrapper[4848]: I0217 09:31:38.037170 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-utilities\") pod \"redhat-marketplace-cdr5v\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:38 crc kubenswrapper[4848]: I0217 09:31:38.059577 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2pvd\" (UniqueName: \"kubernetes.io/projected/31a42093-0072-44a7-9357-58c67e413a4c-kube-api-access-t2pvd\") pod \"redhat-marketplace-cdr5v\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:38 crc kubenswrapper[4848]: I0217 09:31:38.075481 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:38 crc kubenswrapper[4848]: I0217 09:31:38.542414 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdr5v"] Feb 17 09:31:39 crc kubenswrapper[4848]: I0217 09:31:39.411646 4848 generic.go:334] "Generic (PLEG): container finished" podID="31a42093-0072-44a7-9357-58c67e413a4c" containerID="4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec" exitCode=0 Feb 17 09:31:39 crc kubenswrapper[4848]: I0217 09:31:39.411776 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdr5v" event={"ID":"31a42093-0072-44a7-9357-58c67e413a4c","Type":"ContainerDied","Data":"4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec"} Feb 17 09:31:39 crc kubenswrapper[4848]: I0217 09:31:39.411966 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdr5v" event={"ID":"31a42093-0072-44a7-9357-58c67e413a4c","Type":"ContainerStarted","Data":"078e3d810c7a1710c098c536c20a87134c097a9d3cbb99c315b136b89c40b0de"} Feb 17 09:31:40 crc kubenswrapper[4848]: I0217 09:31:40.430865 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdr5v" event={"ID":"31a42093-0072-44a7-9357-58c67e413a4c","Type":"ContainerStarted","Data":"add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b"} Feb 17 09:31:41 crc kubenswrapper[4848]: I0217 09:31:41.443409 4848 generic.go:334] "Generic (PLEG): container finished" podID="31a42093-0072-44a7-9357-58c67e413a4c" containerID="add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b" exitCode=0 Feb 17 09:31:41 crc kubenswrapper[4848]: I0217 09:31:41.443552 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdr5v" 
event={"ID":"31a42093-0072-44a7-9357-58c67e413a4c","Type":"ContainerDied","Data":"add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b"} Feb 17 09:31:42 crc kubenswrapper[4848]: I0217 09:31:42.454495 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdr5v" event={"ID":"31a42093-0072-44a7-9357-58c67e413a4c","Type":"ContainerStarted","Data":"ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e"} Feb 17 09:31:42 crc kubenswrapper[4848]: I0217 09:31:42.490091 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-cdr5v" podStartSLOduration=3.005380148 podStartE2EDuration="5.490059812s" podCreationTimestamp="2026-02-17 09:31:37 +0000 UTC" firstStartedPulling="2026-02-17 09:31:39.413941557 +0000 UTC m=+1576.957197243" lastFinishedPulling="2026-02-17 09:31:41.898621251 +0000 UTC m=+1579.441876907" observedRunningTime="2026-02-17 09:31:42.479674805 +0000 UTC m=+1580.022930501" watchObservedRunningTime="2026-02-17 09:31:42.490059812 +0000 UTC m=+1580.033315498" Feb 17 09:31:48 crc kubenswrapper[4848]: I0217 09:31:48.076372 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:48 crc kubenswrapper[4848]: I0217 09:31:48.078454 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:48 crc kubenswrapper[4848]: I0217 09:31:48.129308 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:48 crc kubenswrapper[4848]: I0217 09:31:48.600861 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:48 crc kubenswrapper[4848]: I0217 09:31:48.679272 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-cdr5v"] Feb 17 09:31:50 crc kubenswrapper[4848]: I0217 09:31:50.062644 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-zh2hd"] Feb 17 09:31:50 crc kubenswrapper[4848]: I0217 09:31:50.076159 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-zh2hd"] Feb 17 09:31:50 crc kubenswrapper[4848]: I0217 09:31:50.558739 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-cdr5v" podUID="31a42093-0072-44a7-9357-58c67e413a4c" containerName="registry-server" containerID="cri-o://ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e" gracePeriod=2 Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.395886 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eaa8789-cc44-4571-a25b-b7a7f56668f8" path="/var/lib/kubelet/pods/6eaa8789-cc44-4571-a25b-b7a7f56668f8/volumes" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.548896 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.577029 4848 generic.go:334] "Generic (PLEG): container finished" podID="9c1fceab-33b4-4eee-8e26-c9bc2a35f018" containerID="4d813db4fd2060bb1e9f9c565c9a4ca40ef9d7dde21d80f2c020a6c1c1f25154" exitCode=0 Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.577116 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" event={"ID":"9c1fceab-33b4-4eee-8e26-c9bc2a35f018","Type":"ContainerDied","Data":"4d813db4fd2060bb1e9f9c565c9a4ca40ef9d7dde21d80f2c020a6c1c1f25154"} Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.580365 4848 generic.go:334] "Generic (PLEG): container finished" podID="31a42093-0072-44a7-9357-58c67e413a4c" containerID="ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e" exitCode=0 Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.580401 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdr5v" event={"ID":"31a42093-0072-44a7-9357-58c67e413a4c","Type":"ContainerDied","Data":"ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e"} Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.580426 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cdr5v" event={"ID":"31a42093-0072-44a7-9357-58c67e413a4c","Type":"ContainerDied","Data":"078e3d810c7a1710c098c536c20a87134c097a9d3cbb99c315b136b89c40b0de"} Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.580460 4848 scope.go:117] "RemoveContainer" containerID="ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.580581 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cdr5v" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.618268 4848 scope.go:117] "RemoveContainer" containerID="add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.647374 4848 scope.go:117] "RemoveContainer" containerID="4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.686498 4848 scope.go:117] "RemoveContainer" containerID="ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e" Feb 17 09:31:51 crc kubenswrapper[4848]: E0217 09:31:51.687052 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e\": container with ID starting with ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e not found: ID does not exist" containerID="ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.687095 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e"} err="failed to get container status \"ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e\": rpc error: code = NotFound desc = could not find container \"ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e\": container with ID starting with ce2877837768ec2ae86d3c12911ada1a5fbce99555a6e64b5505a1518eb9152e not found: ID does not exist" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.687129 4848 scope.go:117] "RemoveContainer" containerID="add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b" Feb 17 09:31:51 crc kubenswrapper[4848]: E0217 09:31:51.687429 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b\": container with ID starting with add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b not found: ID does not exist" containerID="add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.687455 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b"} err="failed to get container status \"add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b\": rpc error: code = NotFound desc = could not find container \"add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b\": container with ID starting with add40db3899cb47e2e8ae125a281417662e59d232c97bdbe981e718d1173932b not found: ID does not exist" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.687471 4848 scope.go:117] "RemoveContainer" containerID="4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec" Feb 17 09:31:51 crc kubenswrapper[4848]: E0217 09:31:51.687694 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec\": container with ID starting with 4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec not found: ID does not exist" containerID="4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.687719 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec"} err="failed to get container status \"4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec\": rpc error: code = NotFound desc = could not find container 
\"4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec\": container with ID starting with 4060d0bd61b74303e7d6d15e10cfaa387d64ebf5f9dfa181572c8cf839294fec not found: ID does not exist" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.732955 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-utilities\") pod \"31a42093-0072-44a7-9357-58c67e413a4c\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.733439 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-catalog-content\") pod \"31a42093-0072-44a7-9357-58c67e413a4c\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.733661 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2pvd\" (UniqueName: \"kubernetes.io/projected/31a42093-0072-44a7-9357-58c67e413a4c-kube-api-access-t2pvd\") pod \"31a42093-0072-44a7-9357-58c67e413a4c\" (UID: \"31a42093-0072-44a7-9357-58c67e413a4c\") " Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.733986 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-utilities" (OuterVolumeSpecName: "utilities") pod "31a42093-0072-44a7-9357-58c67e413a4c" (UID: "31a42093-0072-44a7-9357-58c67e413a4c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.734621 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.739689 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a42093-0072-44a7-9357-58c67e413a4c-kube-api-access-t2pvd" (OuterVolumeSpecName: "kube-api-access-t2pvd") pod "31a42093-0072-44a7-9357-58c67e413a4c" (UID: "31a42093-0072-44a7-9357-58c67e413a4c"). InnerVolumeSpecName "kube-api-access-t2pvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.755261 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "31a42093-0072-44a7-9357-58c67e413a4c" (UID: "31a42093-0072-44a7-9357-58c67e413a4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.836373 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2pvd\" (UniqueName: \"kubernetes.io/projected/31a42093-0072-44a7-9357-58c67e413a4c-kube-api-access-t2pvd\") on node \"crc\" DevicePath \"\"" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.836407 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31a42093-0072-44a7-9357-58c67e413a4c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.934628 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdr5v"] Feb 17 09:31:51 crc kubenswrapper[4848]: I0217 09:31:51.947525 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cdr5v"] Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.115025 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.266725 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-inventory\") pod \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.267372 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-ssh-key-openstack-edpm-ipam\") pod \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.267420 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9j9x\" (UniqueName: \"kubernetes.io/projected/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-kube-api-access-l9j9x\") pod \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\" (UID: \"9c1fceab-33b4-4eee-8e26-c9bc2a35f018\") " Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.272861 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-kube-api-access-l9j9x" (OuterVolumeSpecName: "kube-api-access-l9j9x") pod "9c1fceab-33b4-4eee-8e26-c9bc2a35f018" (UID: "9c1fceab-33b4-4eee-8e26-c9bc2a35f018"). InnerVolumeSpecName "kube-api-access-l9j9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.301277 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-inventory" (OuterVolumeSpecName: "inventory") pod "9c1fceab-33b4-4eee-8e26-c9bc2a35f018" (UID: "9c1fceab-33b4-4eee-8e26-c9bc2a35f018"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.312533 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9c1fceab-33b4-4eee-8e26-c9bc2a35f018" (UID: "9c1fceab-33b4-4eee-8e26-c9bc2a35f018"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.370428 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.370499 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.370526 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9j9x\" (UniqueName: \"kubernetes.io/projected/9c1fceab-33b4-4eee-8e26-c9bc2a35f018-kube-api-access-l9j9x\") on node \"crc\" DevicePath \"\"" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.416643 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31a42093-0072-44a7-9357-58c67e413a4c" path="/var/lib/kubelet/pods/31a42093-0072-44a7-9357-58c67e413a4c/volumes" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.607616 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" event={"ID":"9c1fceab-33b4-4eee-8e26-c9bc2a35f018","Type":"ContainerDied","Data":"7cf4b2b87153cf9fb6f58adf2e80a10b15bba91d4ee087d98fa149102460aee0"} Feb 17 09:31:53 
crc kubenswrapper[4848]: I0217 09:31:53.607657 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7cf4b2b87153cf9fb6f58adf2e80a10b15bba91d4ee087d98fa149102460aee0" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.607707 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.704096 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd"] Feb 17 09:31:53 crc kubenswrapper[4848]: E0217 09:31:53.704518 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a42093-0072-44a7-9357-58c67e413a4c" containerName="extract-content" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.704530 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a42093-0072-44a7-9357-58c67e413a4c" containerName="extract-content" Feb 17 09:31:53 crc kubenswrapper[4848]: E0217 09:31:53.704541 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a42093-0072-44a7-9357-58c67e413a4c" containerName="extract-utilities" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.704549 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a42093-0072-44a7-9357-58c67e413a4c" containerName="extract-utilities" Feb 17 09:31:53 crc kubenswrapper[4848]: E0217 09:31:53.704569 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c1fceab-33b4-4eee-8e26-c9bc2a35f018" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.704577 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c1fceab-33b4-4eee-8e26-c9bc2a35f018" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 17 09:31:53 crc kubenswrapper[4848]: E0217 09:31:53.704591 4848 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="31a42093-0072-44a7-9357-58c67e413a4c" containerName="registry-server" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.704597 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a42093-0072-44a7-9357-58c67e413a4c" containerName="registry-server" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.704807 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c1fceab-33b4-4eee-8e26-c9bc2a35f018" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.704828 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a42093-0072-44a7-9357-58c67e413a4c" containerName="registry-server" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.705403 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.709430 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.709554 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.709622 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.709859 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.727631 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd"] Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.879678 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.879843 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnr8b\" (UniqueName: \"kubernetes.io/projected/31a5681d-60a0-455a-af52-e43f66fb1e93-kube-api-access-bnr8b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.879871 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.982382 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnr8b\" (UniqueName: \"kubernetes.io/projected/31a5681d-60a0-455a-af52-e43f66fb1e93-kube-api-access-bnr8b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.982440 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.982588 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.987239 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:53 crc kubenswrapper[4848]: I0217 09:31:53.988065 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:54 crc kubenswrapper[4848]: I0217 09:31:54.004372 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnr8b\" (UniqueName: \"kubernetes.io/projected/31a5681d-60a0-455a-af52-e43f66fb1e93-kube-api-access-bnr8b\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:54 crc kubenswrapper[4848]: I0217 09:31:54.065532 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:31:54 crc kubenswrapper[4848]: I0217 09:31:54.633412 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd"] Feb 17 09:31:55 crc kubenswrapper[4848]: I0217 09:31:55.633052 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" event={"ID":"31a5681d-60a0-455a-af52-e43f66fb1e93","Type":"ContainerStarted","Data":"b53b25e0896d6f3c7b67b5ffdfe499692da01cdadf4727759b191fa6ffe4bdd3"} Feb 17 09:31:55 crc kubenswrapper[4848]: I0217 09:31:55.633390 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" event={"ID":"31a5681d-60a0-455a-af52-e43f66fb1e93","Type":"ContainerStarted","Data":"c1bfeba469e4f2042a429853b75a6f20c444e733c6bd02b427f948354973a0de"} Feb 17 09:31:55 crc kubenswrapper[4848]: I0217 09:31:55.660956 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" podStartSLOduration=2.218975 podStartE2EDuration="2.660930686s" podCreationTimestamp="2026-02-17 09:31:53 +0000 UTC" firstStartedPulling="2026-02-17 09:31:54.644943437 +0000 UTC m=+1592.188199083" lastFinishedPulling="2026-02-17 09:31:55.086899113 +0000 UTC m=+1592.630154769" observedRunningTime="2026-02-17 09:31:55.649395016 +0000 UTC m=+1593.192650702" watchObservedRunningTime="2026-02-17 09:31:55.660930686 +0000 UTC m=+1593.204186362" Feb 17 09:31:59 crc kubenswrapper[4848]: I0217 09:31:59.669128 4848 generic.go:334] "Generic (PLEG): container finished" podID="31a5681d-60a0-455a-af52-e43f66fb1e93" 
containerID="b53b25e0896d6f3c7b67b5ffdfe499692da01cdadf4727759b191fa6ffe4bdd3" exitCode=0 Feb 17 09:31:59 crc kubenswrapper[4848]: I0217 09:31:59.669261 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" event={"ID":"31a5681d-60a0-455a-af52-e43f66fb1e93","Type":"ContainerDied","Data":"b53b25e0896d6f3c7b67b5ffdfe499692da01cdadf4727759b191fa6ffe4bdd3"} Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.159945 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.322513 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-ssh-key-openstack-edpm-ipam\") pod \"31a5681d-60a0-455a-af52-e43f66fb1e93\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.322820 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnr8b\" (UniqueName: \"kubernetes.io/projected/31a5681d-60a0-455a-af52-e43f66fb1e93-kube-api-access-bnr8b\") pod \"31a5681d-60a0-455a-af52-e43f66fb1e93\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.322926 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-inventory\") pod \"31a5681d-60a0-455a-af52-e43f66fb1e93\" (UID: \"31a5681d-60a0-455a-af52-e43f66fb1e93\") " Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.336933 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31a5681d-60a0-455a-af52-e43f66fb1e93-kube-api-access-bnr8b" (OuterVolumeSpecName: 
"kube-api-access-bnr8b") pod "31a5681d-60a0-455a-af52-e43f66fb1e93" (UID: "31a5681d-60a0-455a-af52-e43f66fb1e93"). InnerVolumeSpecName "kube-api-access-bnr8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.368298 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-inventory" (OuterVolumeSpecName: "inventory") pod "31a5681d-60a0-455a-af52-e43f66fb1e93" (UID: "31a5681d-60a0-455a-af52-e43f66fb1e93"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.381732 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "31a5681d-60a0-455a-af52-e43f66fb1e93" (UID: "31a5681d-60a0-455a-af52-e43f66fb1e93"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.424717 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnr8b\" (UniqueName: \"kubernetes.io/projected/31a5681d-60a0-455a-af52-e43f66fb1e93-kube-api-access-bnr8b\") on node \"crc\" DevicePath \"\"" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.424743 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.424752 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/31a5681d-60a0-455a-af52-e43f66fb1e93-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.696210 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" event={"ID":"31a5681d-60a0-455a-af52-e43f66fb1e93","Type":"ContainerDied","Data":"c1bfeba469e4f2042a429853b75a6f20c444e733c6bd02b427f948354973a0de"} Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.696249 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.696296 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1bfeba469e4f2042a429853b75a6f20c444e733c6bd02b427f948354973a0de" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.775691 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp"] Feb 17 09:32:01 crc kubenswrapper[4848]: E0217 09:32:01.776227 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31a5681d-60a0-455a-af52-e43f66fb1e93" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.776250 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="31a5681d-60a0-455a-af52-e43f66fb1e93" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.776485 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="31a5681d-60a0-455a-af52-e43f66fb1e93" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.777263 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.780340 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.780462 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.780583 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.780647 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.810895 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp"] Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.933940 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zp8bp\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.934215 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zp8bp\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:01 crc kubenswrapper[4848]: I0217 09:32:01.934329 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cdnt\" (UniqueName: \"kubernetes.io/projected/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-kube-api-access-9cdnt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zp8bp\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:02 crc kubenswrapper[4848]: I0217 09:32:02.036639 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zp8bp\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:02 crc kubenswrapper[4848]: I0217 09:32:02.036756 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zp8bp\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:02 crc kubenswrapper[4848]: I0217 09:32:02.036863 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cdnt\" (UniqueName: \"kubernetes.io/projected/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-kube-api-access-9cdnt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zp8bp\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:02 crc kubenswrapper[4848]: I0217 09:32:02.041464 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zp8bp\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:02 crc kubenswrapper[4848]: I0217 09:32:02.041902 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zp8bp\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:02 crc kubenswrapper[4848]: I0217 09:32:02.067727 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cdnt\" (UniqueName: \"kubernetes.io/projected/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-kube-api-access-9cdnt\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-zp8bp\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:02 crc kubenswrapper[4848]: I0217 09:32:02.074952 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-j6p6m"] Feb 17 09:32:02 crc kubenswrapper[4848]: I0217 09:32:02.087095 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-j6p6m"] Feb 17 09:32:02 crc kubenswrapper[4848]: I0217 09:32:02.095645 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:02 crc kubenswrapper[4848]: I0217 09:32:02.710621 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp"] Feb 17 09:32:02 crc kubenswrapper[4848]: W0217 09:32:02.740052 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55a2808a_1ccd_4a2b_bf2f_25e7ea8c069f.slice/crio-3633911c7e20964e389ae6fcd9ec765d54c7ef718a407e3cfb69688dab57bb93 WatchSource:0}: Error finding container 3633911c7e20964e389ae6fcd9ec765d54c7ef718a407e3cfb69688dab57bb93: Status 404 returned error can't find the container with id 3633911c7e20964e389ae6fcd9ec765d54c7ef718a407e3cfb69688dab57bb93 Feb 17 09:32:03 crc kubenswrapper[4848]: I0217 09:32:03.412829 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd600b5-f56b-460a-acb0-a4dc5fd2de23" path="/var/lib/kubelet/pods/0dd600b5-f56b-460a-acb0-a4dc5fd2de23/volumes" Feb 17 09:32:03 crc kubenswrapper[4848]: I0217 09:32:03.727018 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" event={"ID":"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f","Type":"ContainerStarted","Data":"97de6850f550f70883e39188581581fec1e2cf5228aa601b7cf74bfad0577f8a"} Feb 17 09:32:03 crc kubenswrapper[4848]: I0217 09:32:03.728127 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" event={"ID":"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f","Type":"ContainerStarted","Data":"3633911c7e20964e389ae6fcd9ec765d54c7ef718a407e3cfb69688dab57bb93"} Feb 17 09:32:03 crc kubenswrapper[4848]: I0217 09:32:03.758554 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" podStartSLOduration=2.2917398540000002 
podStartE2EDuration="2.75852522s" podCreationTimestamp="2026-02-17 09:32:01 +0000 UTC" firstStartedPulling="2026-02-17 09:32:02.743109617 +0000 UTC m=+1600.286365273" lastFinishedPulling="2026-02-17 09:32:03.209894993 +0000 UTC m=+1600.753150639" observedRunningTime="2026-02-17 09:32:03.744269353 +0000 UTC m=+1601.287524999" watchObservedRunningTime="2026-02-17 09:32:03.75852522 +0000 UTC m=+1601.301780896" Feb 17 09:32:14 crc kubenswrapper[4848]: I0217 09:32:14.046892 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-jq6l8"] Feb 17 09:32:14 crc kubenswrapper[4848]: I0217 09:32:14.058639 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-ptthx"] Feb 17 09:32:14 crc kubenswrapper[4848]: I0217 09:32:14.070862 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-jq6l8"] Feb 17 09:32:14 crc kubenswrapper[4848]: I0217 09:32:14.081352 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-ptthx"] Feb 17 09:32:15 crc kubenswrapper[4848]: I0217 09:32:15.037167 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-lh5dl"] Feb 17 09:32:15 crc kubenswrapper[4848]: I0217 09:32:15.050187 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-lh5dl"] Feb 17 09:32:15 crc kubenswrapper[4848]: I0217 09:32:15.401148 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e0d87d6-fd65-4565-971f-070050f2f9ff" path="/var/lib/kubelet/pods/9e0d87d6-fd65-4565-971f-070050f2f9ff/volumes" Feb 17 09:32:15 crc kubenswrapper[4848]: I0217 09:32:15.402263 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba8c67b5-216a-4f60-baee-c1b6211f89ec" path="/var/lib/kubelet/pods/ba8c67b5-216a-4f60-baee-c1b6211f89ec/volumes" Feb 17 09:32:15 crc kubenswrapper[4848]: I0217 09:32:15.403275 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="dcb8e782-fe60-4c41-a843-1980fc8ab3cc" path="/var/lib/kubelet/pods/dcb8e782-fe60-4c41-a843-1980fc8ab3cc/volumes" Feb 17 09:32:18 crc kubenswrapper[4848]: I0217 09:32:18.772618 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:32:18 crc kubenswrapper[4848]: I0217 09:32:18.773066 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:32:27 crc kubenswrapper[4848]: I0217 09:32:27.309924 4848 scope.go:117] "RemoveContainer" containerID="2b466d5e14ae33a601d99defdc04b1b4d7027f30f25820940d7dcb74da3162bc" Feb 17 09:32:27 crc kubenswrapper[4848]: I0217 09:32:27.357459 4848 scope.go:117] "RemoveContainer" containerID="b017242cfa16fac3f6a3292dc1d6ae30e4633ac9f56cd1d03d93c208f868655b" Feb 17 09:32:27 crc kubenswrapper[4848]: I0217 09:32:27.442881 4848 scope.go:117] "RemoveContainer" containerID="6074f51855bf7d05823997c5ea7977da776d02f7b85e02c3d39e1c949344f06b" Feb 17 09:32:27 crc kubenswrapper[4848]: I0217 09:32:27.492369 4848 scope.go:117] "RemoveContainer" containerID="88d0e2594cadd88bd48db0488f569383e741546126bbcdfc360e84ebf2c60057" Feb 17 09:32:27 crc kubenswrapper[4848]: I0217 09:32:27.530569 4848 scope.go:117] "RemoveContainer" containerID="e0a8bd53dc65343b0d928854371b7e08c32dce5bf61b758ff61ed51429aa128d" Feb 17 09:32:40 crc kubenswrapper[4848]: I0217 09:32:40.202421 4848 generic.go:334] "Generic (PLEG): container finished" podID="55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f" 
containerID="97de6850f550f70883e39188581581fec1e2cf5228aa601b7cf74bfad0577f8a" exitCode=0 Feb 17 09:32:40 crc kubenswrapper[4848]: I0217 09:32:40.202514 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" event={"ID":"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f","Type":"ContainerDied","Data":"97de6850f550f70883e39188581581fec1e2cf5228aa601b7cf74bfad0577f8a"} Feb 17 09:32:41 crc kubenswrapper[4848]: I0217 09:32:41.608405 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:41 crc kubenswrapper[4848]: I0217 09:32:41.695592 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-inventory\") pod \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " Feb 17 09:32:41 crc kubenswrapper[4848]: I0217 09:32:41.696082 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cdnt\" (UniqueName: \"kubernetes.io/projected/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-kube-api-access-9cdnt\") pod \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " Feb 17 09:32:41 crc kubenswrapper[4848]: I0217 09:32:41.696172 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-ssh-key-openstack-edpm-ipam\") pod \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\" (UID: \"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f\") " Feb 17 09:32:41 crc kubenswrapper[4848]: I0217 09:32:41.702427 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-kube-api-access-9cdnt" (OuterVolumeSpecName: "kube-api-access-9cdnt") 
pod "55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f" (UID: "55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f"). InnerVolumeSpecName "kube-api-access-9cdnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:32:41 crc kubenswrapper[4848]: I0217 09:32:41.728314 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-inventory" (OuterVolumeSpecName: "inventory") pod "55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f" (UID: "55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:32:41 crc kubenswrapper[4848]: I0217 09:32:41.730900 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f" (UID: "55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:32:41 crc kubenswrapper[4848]: I0217 09:32:41.800080 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cdnt\" (UniqueName: \"kubernetes.io/projected/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-kube-api-access-9cdnt\") on node \"crc\" DevicePath \"\"" Feb 17 09:32:41 crc kubenswrapper[4848]: I0217 09:32:41.800384 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:32:41 crc kubenswrapper[4848]: I0217 09:32:41.800551 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.228688 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" event={"ID":"55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f","Type":"ContainerDied","Data":"3633911c7e20964e389ae6fcd9ec765d54c7ef718a407e3cfb69688dab57bb93"} Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.229096 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3633911c7e20964e389ae6fcd9ec765d54c7ef718a407e3cfb69688dab57bb93" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.228885 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-zp8bp" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.468384 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h"] Feb 17 09:32:42 crc kubenswrapper[4848]: E0217 09:32:42.469002 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.469057 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.469440 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.470520 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.478174 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.478719 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.479360 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.479877 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.485155 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h"] Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.622543 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.623170 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjzn9\" (UniqueName: \"kubernetes.io/projected/cbf54dd4-b933-400a-bef2-44bc87fbf3de-kube-api-access-kjzn9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:42 crc 
kubenswrapper[4848]: I0217 09:32:42.623417 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.725774 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.725985 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.726158 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjzn9\" (UniqueName: \"kubernetes.io/projected/cbf54dd4-b933-400a-bef2-44bc87fbf3de-kube-api-access-kjzn9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.734144 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.736414 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.746294 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjzn9\" (UniqueName: \"kubernetes.io/projected/cbf54dd4-b933-400a-bef2-44bc87fbf3de-kube-api-access-kjzn9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:42 crc kubenswrapper[4848]: I0217 09:32:42.823968 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:32:43 crc kubenswrapper[4848]: W0217 09:32:43.365860 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbf54dd4_b933_400a_bef2_44bc87fbf3de.slice/crio-c26c3a423d401fe3d04f07b80462fd0c605c87167e49be599dee31bab4a28c4a WatchSource:0}: Error finding container c26c3a423d401fe3d04f07b80462fd0c605c87167e49be599dee31bab4a28c4a: Status 404 returned error can't find the container with id c26c3a423d401fe3d04f07b80462fd0c605c87167e49be599dee31bab4a28c4a Feb 17 09:32:43 crc kubenswrapper[4848]: I0217 09:32:43.369941 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h"] Feb 17 09:32:44 crc kubenswrapper[4848]: I0217 09:32:44.260504 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" event={"ID":"cbf54dd4-b933-400a-bef2-44bc87fbf3de","Type":"ContainerStarted","Data":"8d77099fa8cfaad124b9bf57430fce577098c11c5e604253d6e6b3b98d67910c"} Feb 17 09:32:44 crc kubenswrapper[4848]: I0217 09:32:44.260864 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" event={"ID":"cbf54dd4-b933-400a-bef2-44bc87fbf3de","Type":"ContainerStarted","Data":"c26c3a423d401fe3d04f07b80462fd0c605c87167e49be599dee31bab4a28c4a"} Feb 17 09:32:44 crc kubenswrapper[4848]: I0217 09:32:44.293802 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" podStartSLOduration=1.828699092 podStartE2EDuration="2.29375212s" podCreationTimestamp="2026-02-17 09:32:42 +0000 UTC" firstStartedPulling="2026-02-17 09:32:43.368501694 +0000 UTC m=+1640.911757340" lastFinishedPulling="2026-02-17 09:32:43.833554702 +0000 UTC m=+1641.376810368" 
observedRunningTime="2026-02-17 09:32:44.287276095 +0000 UTC m=+1641.830531751" watchObservedRunningTime="2026-02-17 09:32:44.29375212 +0000 UTC m=+1641.837007766" Feb 17 09:32:48 crc kubenswrapper[4848]: I0217 09:32:48.772140 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:32:48 crc kubenswrapper[4848]: I0217 09:32:48.772720 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:32:58 crc kubenswrapper[4848]: I0217 09:32:58.102198 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xqmdf"] Feb 17 09:32:58 crc kubenswrapper[4848]: I0217 09:32:58.112888 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-ef1a-account-create-update-szj2h"] Feb 17 09:32:58 crc kubenswrapper[4848]: I0217 09:32:58.124026 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xqmdf"] Feb 17 09:32:58 crc kubenswrapper[4848]: I0217 09:32:58.132345 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-ef1a-account-create-update-szj2h"] Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.038697 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-j9fzr"] Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.046628 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-5bbe-account-create-update-9j27z"] Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.055102 4848 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-9nsmf"] Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.062467 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-j9fzr"] Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.070138 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-5bbe-account-create-update-9j27z"] Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.076440 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-94f2-account-create-update-zdvm6"] Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.086392 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-9nsmf"] Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.095140 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-94f2-account-create-update-zdvm6"] Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.403899 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef72abf-423d-4df8-8d07-f2340b53ddff" path="/var/lib/kubelet/pods/2ef72abf-423d-4df8-8d07-f2340b53ddff/volumes" Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.405040 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dc6db25-7a23-453e-b3e5-331167e6d51e" path="/var/lib/kubelet/pods/7dc6db25-7a23-453e-b3e5-331167e6d51e/volumes" Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.406513 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e349818-833c-41bd-83dd-00ca201c100c" path="/var/lib/kubelet/pods/7e349818-833c-41bd-83dd-00ca201c100c/volumes" Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.407731 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e88c051-4096-47e8-b941-4cee5e3971ef" path="/var/lib/kubelet/pods/8e88c051-4096-47e8-b941-4cee5e3971ef/volumes" Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 
09:32:59.410537 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa12092e-4e11-4aa1-a495-383d39eb7806" path="/var/lib/kubelet/pods/aa12092e-4e11-4aa1-a495-383d39eb7806/volumes" Feb 17 09:32:59 crc kubenswrapper[4848]: I0217 09:32:59.411457 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0" path="/var/lib/kubelet/pods/cdacd3e3-6fa8-4964-8a0b-f97b7d2732d0/volumes" Feb 17 09:33:18 crc kubenswrapper[4848]: I0217 09:33:18.772415 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:33:18 crc kubenswrapper[4848]: I0217 09:33:18.773140 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:33:18 crc kubenswrapper[4848]: I0217 09:33:18.773208 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:33:18 crc kubenswrapper[4848]: I0217 09:33:18.774339 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:33:18 crc kubenswrapper[4848]: I0217 09:33:18.774437 4848 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" gracePeriod=600 Feb 17 09:33:18 crc kubenswrapper[4848]: E0217 09:33:18.907697 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:33:19 crc kubenswrapper[4848]: I0217 09:33:19.654990 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" exitCode=0 Feb 17 09:33:19 crc kubenswrapper[4848]: I0217 09:33:19.655039 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521"} Feb 17 09:33:19 crc kubenswrapper[4848]: I0217 09:33:19.655848 4848 scope.go:117] "RemoveContainer" containerID="7308c58dc682de7c0d5cf3657a75860bc4b9b36b777a9bde17b6a927094a6302" Feb 17 09:33:19 crc kubenswrapper[4848]: I0217 09:33:19.656838 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:33:19 crc kubenswrapper[4848]: E0217 09:33:19.657387 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:33:27 crc kubenswrapper[4848]: I0217 09:33:27.650750 4848 scope.go:117] "RemoveContainer" containerID="ac0420ac46441099abbb97df4471c45378300aa407c295c6dc5601869f8cfc14" Feb 17 09:33:27 crc kubenswrapper[4848]: I0217 09:33:27.680007 4848 scope.go:117] "RemoveContainer" containerID="9c748faf11b96df9c24f648de549bf9743322f6ee4f0112dc2c322a8f7e8b421" Feb 17 09:33:27 crc kubenswrapper[4848]: I0217 09:33:27.757784 4848 scope.go:117] "RemoveContainer" containerID="68ac475da06f9e5cb5def7b9378fb02469850335465993b12efa2f8f2e38f3db" Feb 17 09:33:27 crc kubenswrapper[4848]: I0217 09:33:27.806518 4848 scope.go:117] "RemoveContainer" containerID="ca6d48b3e6baeb8fa9f233e1a3b83b8aaf878c5e7552689e75c656e2bb10f306" Feb 17 09:33:27 crc kubenswrapper[4848]: I0217 09:33:27.861141 4848 scope.go:117] "RemoveContainer" containerID="e66fb734dca6f821d231e6f7e98ca5f789b1694c1ec5e60d44cd0205a33b0419" Feb 17 09:33:27 crc kubenswrapper[4848]: I0217 09:33:27.909216 4848 scope.go:117] "RemoveContainer" containerID="df83fb9e444b434e99b55684e3b3dc7cea87afd2e1ac228f53efdf7fa987972b" Feb 17 09:33:28 crc kubenswrapper[4848]: I0217 09:33:28.065879 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5js8t"] Feb 17 09:33:28 crc kubenswrapper[4848]: I0217 09:33:28.080827 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-5js8t"] Feb 17 09:33:29 crc kubenswrapper[4848]: I0217 09:33:29.396111 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="910f96d1-b14a-49ea-9153-1fb90774711d" path="/var/lib/kubelet/pods/910f96d1-b14a-49ea-9153-1fb90774711d/volumes" Feb 17 09:33:29 crc kubenswrapper[4848]: I0217 09:33:29.766693 4848 generic.go:334] "Generic (PLEG): container 
finished" podID="cbf54dd4-b933-400a-bef2-44bc87fbf3de" containerID="8d77099fa8cfaad124b9bf57430fce577098c11c5e604253d6e6b3b98d67910c" exitCode=0 Feb 17 09:33:29 crc kubenswrapper[4848]: I0217 09:33:29.766815 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" event={"ID":"cbf54dd4-b933-400a-bef2-44bc87fbf3de","Type":"ContainerDied","Data":"8d77099fa8cfaad124b9bf57430fce577098c11c5e604253d6e6b3b98d67910c"} Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.270308 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.373202 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-inventory\") pod \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.373395 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-ssh-key-openstack-edpm-ipam\") pod \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.373545 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjzn9\" (UniqueName: \"kubernetes.io/projected/cbf54dd4-b933-400a-bef2-44bc87fbf3de-kube-api-access-kjzn9\") pod \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\" (UID: \"cbf54dd4-b933-400a-bef2-44bc87fbf3de\") " Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.380670 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/cbf54dd4-b933-400a-bef2-44bc87fbf3de-kube-api-access-kjzn9" (OuterVolumeSpecName: "kube-api-access-kjzn9") pod "cbf54dd4-b933-400a-bef2-44bc87fbf3de" (UID: "cbf54dd4-b933-400a-bef2-44bc87fbf3de"). InnerVolumeSpecName "kube-api-access-kjzn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.416855 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cbf54dd4-b933-400a-bef2-44bc87fbf3de" (UID: "cbf54dd4-b933-400a-bef2-44bc87fbf3de"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.426018 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-inventory" (OuterVolumeSpecName: "inventory") pod "cbf54dd4-b933-400a-bef2-44bc87fbf3de" (UID: "cbf54dd4-b933-400a-bef2-44bc87fbf3de"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.477629 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.477661 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjzn9\" (UniqueName: \"kubernetes.io/projected/cbf54dd4-b933-400a-bef2-44bc87fbf3de-kube-api-access-kjzn9\") on node \"crc\" DevicePath \"\"" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.477676 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cbf54dd4-b933-400a-bef2-44bc87fbf3de-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.786836 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" event={"ID":"cbf54dd4-b933-400a-bef2-44bc87fbf3de","Type":"ContainerDied","Data":"c26c3a423d401fe3d04f07b80462fd0c605c87167e49be599dee31bab4a28c4a"} Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.786885 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c26c3a423d401fe3d04f07b80462fd0c605c87167e49be599dee31bab4a28c4a" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.786951 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.951309 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6gvlr"] Feb 17 09:33:31 crc kubenswrapper[4848]: E0217 09:33:31.951826 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbf54dd4-b933-400a-bef2-44bc87fbf3de" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.951851 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbf54dd4-b933-400a-bef2-44bc87fbf3de" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.952071 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbf54dd4-b933-400a-bef2-44bc87fbf3de" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.952940 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.957432 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.957524 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.958161 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.960407 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.980604 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6gvlr"] Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.987904 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6gvlr\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.988282 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggxww\" (UniqueName: \"kubernetes.io/projected/9042f387-8534-4c6e-a64a-08154984ff7d-kube-api-access-ggxww\") pod \"ssh-known-hosts-edpm-deployment-6gvlr\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:31 crc kubenswrapper[4848]: I0217 09:33:31.988538 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6gvlr\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:32 crc kubenswrapper[4848]: I0217 09:33:32.090314 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6gvlr\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:32 crc kubenswrapper[4848]: I0217 09:33:32.090439 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggxww\" (UniqueName: \"kubernetes.io/projected/9042f387-8534-4c6e-a64a-08154984ff7d-kube-api-access-ggxww\") pod \"ssh-known-hosts-edpm-deployment-6gvlr\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:32 crc kubenswrapper[4848]: I0217 09:33:32.090516 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6gvlr\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:32 crc kubenswrapper[4848]: I0217 09:33:32.094677 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-6gvlr\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:32 crc 
kubenswrapper[4848]: I0217 09:33:32.095272 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-6gvlr\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:32 crc kubenswrapper[4848]: I0217 09:33:32.114843 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggxww\" (UniqueName: \"kubernetes.io/projected/9042f387-8534-4c6e-a64a-08154984ff7d-kube-api-access-ggxww\") pod \"ssh-known-hosts-edpm-deployment-6gvlr\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:32 crc kubenswrapper[4848]: I0217 09:33:32.275427 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:32 crc kubenswrapper[4848]: I0217 09:33:32.615923 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-6gvlr"] Feb 17 09:33:32 crc kubenswrapper[4848]: I0217 09:33:32.798949 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" event={"ID":"9042f387-8534-4c6e-a64a-08154984ff7d","Type":"ContainerStarted","Data":"93963e1e267f4fad83c0941e046c6416602a40b7a516a302893ff2c15cafa789"} Feb 17 09:33:33 crc kubenswrapper[4848]: I0217 09:33:33.814475 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" event={"ID":"9042f387-8534-4c6e-a64a-08154984ff7d","Type":"ContainerStarted","Data":"eb7afa9ef2a6f4227650289fc744451102e271edcf87ba55800b449dfacb60fd"} Feb 17 09:33:33 crc kubenswrapper[4848]: I0217 09:33:33.842711 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" podStartSLOduration=2.36024914 podStartE2EDuration="2.842684229s" podCreationTimestamp="2026-02-17 09:33:31 +0000 UTC" firstStartedPulling="2026-02-17 09:33:32.622974509 +0000 UTC m=+1690.166230155" lastFinishedPulling="2026-02-17 09:33:33.105409608 +0000 UTC m=+1690.648665244" observedRunningTime="2026-02-17 09:33:33.832527621 +0000 UTC m=+1691.375783287" watchObservedRunningTime="2026-02-17 09:33:33.842684229 +0000 UTC m=+1691.385939915" Feb 17 09:33:34 crc kubenswrapper[4848]: I0217 09:33:34.383227 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:33:34 crc kubenswrapper[4848]: E0217 09:33:34.383659 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:33:39 crc kubenswrapper[4848]: I0217 09:33:39.865315 4848 generic.go:334] "Generic (PLEG): container finished" podID="9042f387-8534-4c6e-a64a-08154984ff7d" containerID="eb7afa9ef2a6f4227650289fc744451102e271edcf87ba55800b449dfacb60fd" exitCode=0 Feb 17 09:33:39 crc kubenswrapper[4848]: I0217 09:33:39.865409 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" event={"ID":"9042f387-8534-4c6e-a64a-08154984ff7d","Type":"ContainerDied","Data":"eb7afa9ef2a6f4227650289fc744451102e271edcf87ba55800b449dfacb60fd"} Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.368931 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.489325 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-inventory-0\") pod \"9042f387-8534-4c6e-a64a-08154984ff7d\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.489456 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggxww\" (UniqueName: \"kubernetes.io/projected/9042f387-8534-4c6e-a64a-08154984ff7d-kube-api-access-ggxww\") pod \"9042f387-8534-4c6e-a64a-08154984ff7d\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.489607 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-ssh-key-openstack-edpm-ipam\") pod \"9042f387-8534-4c6e-a64a-08154984ff7d\" (UID: \"9042f387-8534-4c6e-a64a-08154984ff7d\") " Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.499558 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9042f387-8534-4c6e-a64a-08154984ff7d-kube-api-access-ggxww" (OuterVolumeSpecName: "kube-api-access-ggxww") pod "9042f387-8534-4c6e-a64a-08154984ff7d" (UID: "9042f387-8534-4c6e-a64a-08154984ff7d"). InnerVolumeSpecName "kube-api-access-ggxww". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.518556 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "9042f387-8534-4c6e-a64a-08154984ff7d" (UID: "9042f387-8534-4c6e-a64a-08154984ff7d"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.535334 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9042f387-8534-4c6e-a64a-08154984ff7d" (UID: "9042f387-8534-4c6e-a64a-08154984ff7d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.592188 4848 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-inventory-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.592251 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggxww\" (UniqueName: \"kubernetes.io/projected/9042f387-8534-4c6e-a64a-08154984ff7d-kube-api-access-ggxww\") on node \"crc\" DevicePath \"\"" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.592281 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9042f387-8534-4c6e-a64a-08154984ff7d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.884336 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" event={"ID":"9042f387-8534-4c6e-a64a-08154984ff7d","Type":"ContainerDied","Data":"93963e1e267f4fad83c0941e046c6416602a40b7a516a302893ff2c15cafa789"} Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.884633 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93963e1e267f4fad83c0941e046c6416602a40b7a516a302893ff2c15cafa789" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.884394 
4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-6gvlr" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.979496 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5"] Feb 17 09:33:41 crc kubenswrapper[4848]: E0217 09:33:41.979882 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9042f387-8534-4c6e-a64a-08154984ff7d" containerName="ssh-known-hosts-edpm-deployment" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.979899 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9042f387-8534-4c6e-a64a-08154984ff7d" containerName="ssh-known-hosts-edpm-deployment" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.980081 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9042f387-8534-4c6e-a64a-08154984ff7d" containerName="ssh-known-hosts-edpm-deployment" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.980740 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.982864 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.983425 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.983581 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:33:41 crc kubenswrapper[4848]: I0217 09:33:41.984147 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.012701 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5"] Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.104122 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2vff5\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.104312 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2vff5\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.104368 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz9bv\" (UniqueName: \"kubernetes.io/projected/14190cb8-1489-4fb2-8c06-0eb40f1f584e-kube-api-access-vz9bv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2vff5\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.206490 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2vff5\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.206564 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz9bv\" (UniqueName: \"kubernetes.io/projected/14190cb8-1489-4fb2-8c06-0eb40f1f584e-kube-api-access-vz9bv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2vff5\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.206630 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2vff5\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.211468 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-ssh-key-openstack-edpm-ipam\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-2vff5\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.212226 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2vff5\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.223146 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz9bv\" (UniqueName: \"kubernetes.io/projected/14190cb8-1489-4fb2-8c06-0eb40f1f584e-kube-api-access-vz9bv\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-2vff5\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.314959 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.847012 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5"] Feb 17 09:33:42 crc kubenswrapper[4848]: I0217 09:33:42.899604 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" event={"ID":"14190cb8-1489-4fb2-8c06-0eb40f1f584e","Type":"ContainerStarted","Data":"de9320bc250f528fbd0e294ead33a66f7eab0300d9ca2ed05c99c53ea58bb05a"} Feb 17 09:33:43 crc kubenswrapper[4848]: I0217 09:33:43.911948 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" event={"ID":"14190cb8-1489-4fb2-8c06-0eb40f1f584e","Type":"ContainerStarted","Data":"19cbab338a5ffbe52779d0bf4babd717c95739a16883d8c7dfa6a416a5737809"} Feb 17 09:33:43 crc kubenswrapper[4848]: I0217 09:33:43.940529 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" podStartSLOduration=2.536155806 podStartE2EDuration="2.94051148s" podCreationTimestamp="2026-02-17 09:33:41 +0000 UTC" firstStartedPulling="2026-02-17 09:33:42.859931968 +0000 UTC m=+1700.403187624" lastFinishedPulling="2026-02-17 09:33:43.264287612 +0000 UTC m=+1700.807543298" observedRunningTime="2026-02-17 09:33:43.936676091 +0000 UTC m=+1701.479931797" watchObservedRunningTime="2026-02-17 09:33:43.94051148 +0000 UTC m=+1701.483767126" Feb 17 09:33:48 crc kubenswrapper[4848]: I0217 09:33:48.383959 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:33:48 crc kubenswrapper[4848]: E0217 09:33:48.384950 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:33:50 crc kubenswrapper[4848]: I0217 09:33:50.042054 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9vlc"] Feb 17 09:33:50 crc kubenswrapper[4848]: I0217 09:33:50.052914 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-h9vlc"] Feb 17 09:33:51 crc kubenswrapper[4848]: I0217 09:33:51.041508 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zgwn9"] Feb 17 09:33:51 crc kubenswrapper[4848]: I0217 09:33:51.052938 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-zgwn9"] Feb 17 09:33:51 crc kubenswrapper[4848]: I0217 09:33:51.397999 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f95863d-9560-4508-9115-5da47d8dd4c2" path="/var/lib/kubelet/pods/5f95863d-9560-4508-9115-5da47d8dd4c2/volumes" Feb 17 09:33:51 crc kubenswrapper[4848]: I0217 09:33:51.399095 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a4198c9-c56f-45e9-90f7-486ddcb9d65f" path="/var/lib/kubelet/pods/7a4198c9-c56f-45e9-90f7-486ddcb9d65f/volumes" Feb 17 09:33:51 crc kubenswrapper[4848]: I0217 09:33:51.991838 4848 generic.go:334] "Generic (PLEG): container finished" podID="14190cb8-1489-4fb2-8c06-0eb40f1f584e" containerID="19cbab338a5ffbe52779d0bf4babd717c95739a16883d8c7dfa6a416a5737809" exitCode=0 Feb 17 09:33:51 crc kubenswrapper[4848]: I0217 09:33:51.991887 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" 
event={"ID":"14190cb8-1489-4fb2-8c06-0eb40f1f584e","Type":"ContainerDied","Data":"19cbab338a5ffbe52779d0bf4babd717c95739a16883d8c7dfa6a416a5737809"} Feb 17 09:33:53 crc kubenswrapper[4848]: I0217 09:33:53.462227 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:53 crc kubenswrapper[4848]: I0217 09:33:53.658448 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz9bv\" (UniqueName: \"kubernetes.io/projected/14190cb8-1489-4fb2-8c06-0eb40f1f584e-kube-api-access-vz9bv\") pod \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " Feb 17 09:33:53 crc kubenswrapper[4848]: I0217 09:33:53.658629 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-ssh-key-openstack-edpm-ipam\") pod \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " Feb 17 09:33:53 crc kubenswrapper[4848]: I0217 09:33:53.658781 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-inventory\") pod \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\" (UID: \"14190cb8-1489-4fb2-8c06-0eb40f1f584e\") " Feb 17 09:33:53 crc kubenswrapper[4848]: I0217 09:33:53.666555 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14190cb8-1489-4fb2-8c06-0eb40f1f584e-kube-api-access-vz9bv" (OuterVolumeSpecName: "kube-api-access-vz9bv") pod "14190cb8-1489-4fb2-8c06-0eb40f1f584e" (UID: "14190cb8-1489-4fb2-8c06-0eb40f1f584e"). InnerVolumeSpecName "kube-api-access-vz9bv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:33:53 crc kubenswrapper[4848]: I0217 09:33:53.692260 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "14190cb8-1489-4fb2-8c06-0eb40f1f584e" (UID: "14190cb8-1489-4fb2-8c06-0eb40f1f584e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:33:53 crc kubenswrapper[4848]: I0217 09:33:53.692961 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-inventory" (OuterVolumeSpecName: "inventory") pod "14190cb8-1489-4fb2-8c06-0eb40f1f584e" (UID: "14190cb8-1489-4fb2-8c06-0eb40f1f584e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:33:53 crc kubenswrapper[4848]: I0217 09:33:53.760627 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:33:53 crc kubenswrapper[4848]: I0217 09:33:53.760668 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14190cb8-1489-4fb2-8c06-0eb40f1f584e-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:33:53 crc kubenswrapper[4848]: I0217 09:33:53.760682 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz9bv\" (UniqueName: \"kubernetes.io/projected/14190cb8-1489-4fb2-8c06-0eb40f1f584e-kube-api-access-vz9bv\") on node \"crc\" DevicePath \"\"" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.012591 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" 
event={"ID":"14190cb8-1489-4fb2-8c06-0eb40f1f584e","Type":"ContainerDied","Data":"de9320bc250f528fbd0e294ead33a66f7eab0300d9ca2ed05c99c53ea58bb05a"} Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.012634 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-2vff5" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.012657 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9320bc250f528fbd0e294ead33a66f7eab0300d9ca2ed05c99c53ea58bb05a" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.125590 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6"] Feb 17 09:33:54 crc kubenswrapper[4848]: E0217 09:33:54.126125 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14190cb8-1489-4fb2-8c06-0eb40f1f584e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.126148 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="14190cb8-1489-4fb2-8c06-0eb40f1f584e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.126424 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="14190cb8-1489-4fb2-8c06-0eb40f1f584e" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.131997 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.134254 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.135119 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.135234 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.135262 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.137991 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6"] Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.181073 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmh7w\" (UniqueName: \"kubernetes.io/projected/3230b202-405b-4545-b04f-8c01231f565e-kube-api-access-kmh7w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.181256 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.181294 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.282969 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.283071 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.283155 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmh7w\" (UniqueName: \"kubernetes.io/projected/3230b202-405b-4545-b04f-8c01231f565e-kube-api-access-kmh7w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.288130 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.288281 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.303306 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmh7w\" (UniqueName: \"kubernetes.io/projected/3230b202-405b-4545-b04f-8c01231f565e-kube-api-access-kmh7w\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:54 crc kubenswrapper[4848]: I0217 09:33:54.452611 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:33:55 crc kubenswrapper[4848]: I0217 09:33:55.061134 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6"] Feb 17 09:33:56 crc kubenswrapper[4848]: I0217 09:33:56.033457 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" event={"ID":"3230b202-405b-4545-b04f-8c01231f565e","Type":"ContainerStarted","Data":"d24aa5b71647beb6d4cbabafe4d739ee51e637b3f27b031a7f2d70338155548a"} Feb 17 09:33:56 crc kubenswrapper[4848]: I0217 09:33:56.033937 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" event={"ID":"3230b202-405b-4545-b04f-8c01231f565e","Type":"ContainerStarted","Data":"5b177461bf931beb41c0b59aaab6a6536b5313a021ff20e9266ec6b929bcfb55"} Feb 17 09:33:56 crc kubenswrapper[4848]: I0217 09:33:56.073964 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" podStartSLOduration=1.645129615 podStartE2EDuration="2.073935053s" podCreationTimestamp="2026-02-17 09:33:54 +0000 UTC" firstStartedPulling="2026-02-17 09:33:55.079907087 +0000 UTC m=+1712.623162753" lastFinishedPulling="2026-02-17 09:33:55.508712545 +0000 UTC m=+1713.051968191" observedRunningTime="2026-02-17 09:33:56.051386853 +0000 UTC m=+1713.594642499" watchObservedRunningTime="2026-02-17 09:33:56.073935053 +0000 UTC m=+1713.617190729" Feb 17 09:34:02 crc kubenswrapper[4848]: I0217 09:34:02.385137 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:34:02 crc kubenswrapper[4848]: E0217 09:34:02.386503 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:34:05 crc kubenswrapper[4848]: I0217 09:34:05.141129 4848 generic.go:334] "Generic (PLEG): container finished" podID="3230b202-405b-4545-b04f-8c01231f565e" containerID="d24aa5b71647beb6d4cbabafe4d739ee51e637b3f27b031a7f2d70338155548a" exitCode=0 Feb 17 09:34:05 crc kubenswrapper[4848]: I0217 09:34:05.141222 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" event={"ID":"3230b202-405b-4545-b04f-8c01231f565e","Type":"ContainerDied","Data":"d24aa5b71647beb6d4cbabafe4d739ee51e637b3f27b031a7f2d70338155548a"} Feb 17 09:34:06 crc kubenswrapper[4848]: I0217 09:34:06.618885 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:34:06 crc kubenswrapper[4848]: I0217 09:34:06.635600 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-inventory\") pod \"3230b202-405b-4545-b04f-8c01231f565e\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " Feb 17 09:34:06 crc kubenswrapper[4848]: I0217 09:34:06.635828 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-ssh-key-openstack-edpm-ipam\") pod \"3230b202-405b-4545-b04f-8c01231f565e\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " Feb 17 09:34:06 crc kubenswrapper[4848]: I0217 09:34:06.635882 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmh7w\" (UniqueName: 
\"kubernetes.io/projected/3230b202-405b-4545-b04f-8c01231f565e-kube-api-access-kmh7w\") pod \"3230b202-405b-4545-b04f-8c01231f565e\" (UID: \"3230b202-405b-4545-b04f-8c01231f565e\") " Feb 17 09:34:06 crc kubenswrapper[4848]: I0217 09:34:06.645780 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3230b202-405b-4545-b04f-8c01231f565e-kube-api-access-kmh7w" (OuterVolumeSpecName: "kube-api-access-kmh7w") pod "3230b202-405b-4545-b04f-8c01231f565e" (UID: "3230b202-405b-4545-b04f-8c01231f565e"). InnerVolumeSpecName "kube-api-access-kmh7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:34:06 crc kubenswrapper[4848]: I0217 09:34:06.669064 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3230b202-405b-4545-b04f-8c01231f565e" (UID: "3230b202-405b-4545-b04f-8c01231f565e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:06 crc kubenswrapper[4848]: I0217 09:34:06.671597 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-inventory" (OuterVolumeSpecName: "inventory") pod "3230b202-405b-4545-b04f-8c01231f565e" (UID: "3230b202-405b-4545-b04f-8c01231f565e"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:06 crc kubenswrapper[4848]: I0217 09:34:06.738605 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:06 crc kubenswrapper[4848]: I0217 09:34:06.738691 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmh7w\" (UniqueName: \"kubernetes.io/projected/3230b202-405b-4545-b04f-8c01231f565e-kube-api-access-kmh7w\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:06 crc kubenswrapper[4848]: I0217 09:34:06.738712 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3230b202-405b-4545-b04f-8c01231f565e-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.160292 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" event={"ID":"3230b202-405b-4545-b04f-8c01231f565e","Type":"ContainerDied","Data":"5b177461bf931beb41c0b59aaab6a6536b5313a021ff20e9266ec6b929bcfb55"} Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.160335 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b177461bf931beb41c0b59aaab6a6536b5313a021ff20e9266ec6b929bcfb55" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.160380 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.256350 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s"] Feb 17 09:34:07 crc kubenswrapper[4848]: E0217 09:34:07.257228 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3230b202-405b-4545-b04f-8c01231f565e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.257318 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3230b202-405b-4545-b04f-8c01231f565e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.257614 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="3230b202-405b-4545-b04f-8c01231f565e" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.258381 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.263833 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.264189 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.264352 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.264515 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.264670 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.264741 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.264744 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.265076 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.279023 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s"] Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347199 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347255 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347378 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347455 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347477 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347536 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347603 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347646 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347696 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347751 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347796 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347822 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347866 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n7fk\" (UniqueName: 
\"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-kube-api-access-7n7fk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.347893 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.449721 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.449790 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.449830 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.449872 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.449905 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.449960 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.450005 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.450037 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.450063 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.450105 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n7fk\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-kube-api-access-7n7fk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.450132 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: 
\"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.450377 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.451021 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.451059 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.454598 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 
crc kubenswrapper[4848]: I0217 09:34:07.456031 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.456136 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.456511 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.457597 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.458199 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.461568 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.466214 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.466250 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.468011 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.468202 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n7fk\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-kube-api-access-7n7fk\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.468311 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.468965 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.469712 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-neutron-metadata-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:07 crc kubenswrapper[4848]: I0217 09:34:07.577089 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:08 crc kubenswrapper[4848]: I0217 09:34:08.102836 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s"] Feb 17 09:34:08 crc kubenswrapper[4848]: W0217 09:34:08.107889 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b1521fa_5ab7_4b0a_b6f3_0c810ca21c34.slice/crio-770f0bda3984d2b9ddced208d8df543ca131ca04da6472361c85211550e8b01d WatchSource:0}: Error finding container 770f0bda3984d2b9ddced208d8df543ca131ca04da6472361c85211550e8b01d: Status 404 returned error can't find the container with id 770f0bda3984d2b9ddced208d8df543ca131ca04da6472361c85211550e8b01d Feb 17 09:34:08 crc kubenswrapper[4848]: I0217 09:34:08.169917 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" event={"ID":"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34","Type":"ContainerStarted","Data":"770f0bda3984d2b9ddced208d8df543ca131ca04da6472361c85211550e8b01d"} Feb 17 09:34:09 crc kubenswrapper[4848]: I0217 09:34:09.181783 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" event={"ID":"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34","Type":"ContainerStarted","Data":"325844d40b29f96a5537cf411e5914ecc57c5a91f15dfe54f0031a8e9edaf0bd"} Feb 17 09:34:09 crc kubenswrapper[4848]: I0217 09:34:09.217788 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" podStartSLOduration=1.776946828 podStartE2EDuration="2.217745756s" podCreationTimestamp="2026-02-17 09:34:07 +0000 UTC" firstStartedPulling="2026-02-17 09:34:08.110040384 +0000 UTC m=+1725.653296020" lastFinishedPulling="2026-02-17 09:34:08.550839302 +0000 UTC m=+1726.094094948" observedRunningTime="2026-02-17 09:34:09.209570244 +0000 UTC m=+1726.752825910" watchObservedRunningTime="2026-02-17 09:34:09.217745756 +0000 UTC m=+1726.761001412" Feb 17 09:34:16 crc kubenswrapper[4848]: I0217 09:34:16.384179 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:34:16 crc kubenswrapper[4848]: E0217 09:34:16.385295 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:34:28 crc kubenswrapper[4848]: I0217 09:34:28.117302 4848 scope.go:117] "RemoveContainer" containerID="9cd17a4bb61419d7e3274e5c1432e3ebe90d529a0e6040483e8cb4d1168587ee" Feb 17 09:34:28 crc kubenswrapper[4848]: I0217 09:34:28.171834 4848 scope.go:117] "RemoveContainer" containerID="13a3939801e3fe31e0d33cb6fb4431afa3e5044ec2189bea83022200dd7894fb" Feb 17 09:34:28 crc kubenswrapper[4848]: I0217 09:34:28.235361 4848 scope.go:117] "RemoveContainer" containerID="d5214704be6a231b3d99cc7ed4e4eb92e94a850437d836cae3b99d1131d1cb01" Feb 17 09:34:29 crc kubenswrapper[4848]: I0217 09:34:29.383924 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:34:29 crc kubenswrapper[4848]: E0217 09:34:29.385060 4848 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:34:34 crc kubenswrapper[4848]: I0217 09:34:34.045122 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-fkfjb"] Feb 17 09:34:34 crc kubenswrapper[4848]: I0217 09:34:34.051959 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-fkfjb"] Feb 17 09:34:35 crc kubenswrapper[4848]: I0217 09:34:35.393645 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a25dbc2d-c651-434c-b30b-0ee52c27d295" path="/var/lib/kubelet/pods/a25dbc2d-c651-434c-b30b-0ee52c27d295/volumes" Feb 17 09:34:42 crc kubenswrapper[4848]: I0217 09:34:42.384957 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:34:42 crc kubenswrapper[4848]: E0217 09:34:42.385616 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:34:42 crc kubenswrapper[4848]: I0217 09:34:42.491823 4848 generic.go:334] "Generic (PLEG): container finished" podID="3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" containerID="325844d40b29f96a5537cf411e5914ecc57c5a91f15dfe54f0031a8e9edaf0bd" exitCode=0 Feb 17 09:34:42 crc kubenswrapper[4848]: I0217 09:34:42.491904 4848 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" event={"ID":"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34","Type":"ContainerDied","Data":"325844d40b29f96a5537cf411e5914ecc57c5a91f15dfe54f0031a8e9edaf0bd"} Feb 17 09:34:43 crc kubenswrapper[4848]: I0217 09:34:43.992561 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.045614 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-neutron-metadata-combined-ca-bundle\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.045676 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-ovn-default-certs-0\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.045723 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-repo-setup-combined-ca-bundle\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.045741 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ssh-key-openstack-edpm-ipam\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: 
\"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.045773 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ovn-combined-ca-bundle\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.045800 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-telemetry-combined-ca-bundle\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.045869 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n7fk\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-kube-api-access-7n7fk\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.045975 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-nova-combined-ca-bundle\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.046013 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: 
I0217 09:34:44.046036 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-inventory\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.046055 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.046115 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-libvirt-combined-ca-bundle\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.046145 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-bootstrap-combined-ca-bundle\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.046187 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\" (UID: \"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34\") " Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.059382 4848 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.059476 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.059549 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.059654 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.059688 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.059910 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.060647 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.063272 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.064282 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.066452 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.070414 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.077953 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-kube-api-access-7n7fk" (OuterVolumeSpecName: "kube-api-access-7n7fk") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "kube-api-access-7n7fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.085337 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.096435 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-inventory" (OuterVolumeSpecName: "inventory") pod "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" (UID: "3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149282 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n7fk\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-kube-api-access-7n7fk\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149539 4848 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149552 4848 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149563 4848 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149575 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149583 4848 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149592 4848 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149601 4848 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149609 4848 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149619 4848 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149628 4848 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149636 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.149646 4848 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 
09:34:44.149653 4848 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.518715 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" event={"ID":"3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34","Type":"ContainerDied","Data":"770f0bda3984d2b9ddced208d8df543ca131ca04da6472361c85211550e8b01d"} Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.518857 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="770f0bda3984d2b9ddced208d8df543ca131ca04da6472361c85211550e8b01d" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.518953 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.633609 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l"] Feb 17 09:34:44 crc kubenswrapper[4848]: E0217 09:34:44.633983 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.633998 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.634180 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.634793 4848 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.641463 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l"] Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.642527 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.643896 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.643960 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.644109 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.644456 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.763389 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.763463 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.763487 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3db20d69-cab8-4176-a71c-172899e90c3d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.763528 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.763613 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h44gc\" (UniqueName: \"kubernetes.io/projected/3db20d69-cab8-4176-a71c-172899e90c3d-kube-api-access-h44gc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.865621 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.865671 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3db20d69-cab8-4176-a71c-172899e90c3d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.865736 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.865868 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h44gc\" (UniqueName: \"kubernetes.io/projected/3db20d69-cab8-4176-a71c-172899e90c3d-kube-api-access-h44gc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.865917 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.867462 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3db20d69-cab8-4176-a71c-172899e90c3d-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: 
\"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.872506 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.872691 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.875334 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.895145 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h44gc\" (UniqueName: \"kubernetes.io/projected/3db20d69-cab8-4176-a71c-172899e90c3d-kube-api-access-h44gc\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-mq87l\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:44 crc kubenswrapper[4848]: I0217 09:34:44.954299 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:34:45 crc kubenswrapper[4848]: I0217 09:34:45.550275 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l"] Feb 17 09:34:46 crc kubenswrapper[4848]: I0217 09:34:46.537671 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" event={"ID":"3db20d69-cab8-4176-a71c-172899e90c3d","Type":"ContainerStarted","Data":"c3709824b511d1ed16a024c1d85c65b1eacd15f5260c1ee2d0f8198412b28f84"} Feb 17 09:34:46 crc kubenswrapper[4848]: I0217 09:34:46.538043 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" event={"ID":"3db20d69-cab8-4176-a71c-172899e90c3d","Type":"ContainerStarted","Data":"0df42b9745410407c7b9bb091f3bd88bb8ef32a8847f2fb152e6304cf8731e25"} Feb 17 09:34:46 crc kubenswrapper[4848]: I0217 09:34:46.574460 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" podStartSLOduration=2.179830806 podStartE2EDuration="2.574433003s" podCreationTimestamp="2026-02-17 09:34:44 +0000 UTC" firstStartedPulling="2026-02-17 09:34:45.561253183 +0000 UTC m=+1763.104508849" lastFinishedPulling="2026-02-17 09:34:45.95585539 +0000 UTC m=+1763.499111046" observedRunningTime="2026-02-17 09:34:46.559643153 +0000 UTC m=+1764.102898839" watchObservedRunningTime="2026-02-17 09:34:46.574433003 +0000 UTC m=+1764.117688679" Feb 17 09:34:54 crc kubenswrapper[4848]: I0217 09:34:54.385110 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:34:54 crc kubenswrapper[4848]: E0217 09:34:54.387255 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:35:05 crc kubenswrapper[4848]: I0217 09:35:05.384065 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:35:05 crc kubenswrapper[4848]: E0217 09:35:05.385265 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:35:18 crc kubenswrapper[4848]: I0217 09:35:18.386245 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:35:18 crc kubenswrapper[4848]: E0217 09:35:18.388233 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:35:28 crc kubenswrapper[4848]: I0217 09:35:28.326672 4848 scope.go:117] "RemoveContainer" containerID="c44e7dddfdf0890d05394d3acecbc3114bdcde1a72f2117b6e7be9ca20880852" Feb 17 09:35:29 crc kubenswrapper[4848]: I0217 09:35:29.383970 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:35:29 crc kubenswrapper[4848]: 
E0217 09:35:29.384998 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:35:41 crc kubenswrapper[4848]: I0217 09:35:41.383142 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:35:41 crc kubenswrapper[4848]: E0217 09:35:41.384055 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:35:45 crc kubenswrapper[4848]: I0217 09:35:45.429905 4848 generic.go:334] "Generic (PLEG): container finished" podID="3db20d69-cab8-4176-a71c-172899e90c3d" containerID="c3709824b511d1ed16a024c1d85c65b1eacd15f5260c1ee2d0f8198412b28f84" exitCode=0 Feb 17 09:35:45 crc kubenswrapper[4848]: I0217 09:35:45.430044 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" event={"ID":"3db20d69-cab8-4176-a71c-172899e90c3d","Type":"ContainerDied","Data":"c3709824b511d1ed16a024c1d85c65b1eacd15f5260c1ee2d0f8198412b28f84"} Feb 17 09:35:46 crc kubenswrapper[4848]: I0217 09:35:46.978818 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.056320 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-inventory\") pod \"3db20d69-cab8-4176-a71c-172899e90c3d\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.056646 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3db20d69-cab8-4176-a71c-172899e90c3d-ovncontroller-config-0\") pod \"3db20d69-cab8-4176-a71c-172899e90c3d\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.056929 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ssh-key-openstack-edpm-ipam\") pod \"3db20d69-cab8-4176-a71c-172899e90c3d\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.057106 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h44gc\" (UniqueName: \"kubernetes.io/projected/3db20d69-cab8-4176-a71c-172899e90c3d-kube-api-access-h44gc\") pod \"3db20d69-cab8-4176-a71c-172899e90c3d\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.057656 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ovn-combined-ca-bundle\") pod \"3db20d69-cab8-4176-a71c-172899e90c3d\" (UID: \"3db20d69-cab8-4176-a71c-172899e90c3d\") " Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.062725 4848 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "3db20d69-cab8-4176-a71c-172899e90c3d" (UID: "3db20d69-cab8-4176-a71c-172899e90c3d"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.063894 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3db20d69-cab8-4176-a71c-172899e90c3d-kube-api-access-h44gc" (OuterVolumeSpecName: "kube-api-access-h44gc") pod "3db20d69-cab8-4176-a71c-172899e90c3d" (UID: "3db20d69-cab8-4176-a71c-172899e90c3d"). InnerVolumeSpecName "kube-api-access-h44gc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.084900 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-inventory" (OuterVolumeSpecName: "inventory") pod "3db20d69-cab8-4176-a71c-172899e90c3d" (UID: "3db20d69-cab8-4176-a71c-172899e90c3d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.088896 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3db20d69-cab8-4176-a71c-172899e90c3d-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "3db20d69-cab8-4176-a71c-172899e90c3d" (UID: "3db20d69-cab8-4176-a71c-172899e90c3d"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.098687 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3db20d69-cab8-4176-a71c-172899e90c3d" (UID: "3db20d69-cab8-4176-a71c-172899e90c3d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.159327 4848 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/3db20d69-cab8-4176-a71c-172899e90c3d-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.159360 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.159373 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h44gc\" (UniqueName: \"kubernetes.io/projected/3db20d69-cab8-4176-a71c-172899e90c3d-kube-api-access-h44gc\") on node \"crc\" DevicePath \"\"" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.159383 4848 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.159395 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3db20d69-cab8-4176-a71c-172899e90c3d-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.464697 4848 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" event={"ID":"3db20d69-cab8-4176-a71c-172899e90c3d","Type":"ContainerDied","Data":"0df42b9745410407c7b9bb091f3bd88bb8ef32a8847f2fb152e6304cf8731e25"} Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.464795 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0df42b9745410407c7b9bb091f3bd88bb8ef32a8847f2fb152e6304cf8731e25" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.464853 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-mq87l" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.573332 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b"] Feb 17 09:35:47 crc kubenswrapper[4848]: E0217 09:35:47.574284 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3db20d69-cab8-4176-a71c-172899e90c3d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.574318 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="3db20d69-cab8-4176-a71c-172899e90c3d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.574611 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="3db20d69-cab8-4176-a71c-172899e90c3d" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.575319 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.579000 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.579252 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.580406 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.584178 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.584481 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.584841 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.602304 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b"] Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.671217 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.671284 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kjdx8\" (UniqueName: \"kubernetes.io/projected/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-kube-api-access-kjdx8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.671312 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.671357 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.671397 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.671563 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.772909 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.772998 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjdx8\" (UniqueName: \"kubernetes.io/projected/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-kube-api-access-kjdx8\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.773033 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.773094 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.773163 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.773204 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.779723 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.780471 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.781445 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.782044 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.787689 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.794668 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjdx8\" (UniqueName: \"kubernetes.io/projected/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-kube-api-access-kjdx8\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:47 crc kubenswrapper[4848]: I0217 09:35:47.925192 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:35:48 crc kubenswrapper[4848]: I0217 09:35:48.576489 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b"] Feb 17 09:35:49 crc kubenswrapper[4848]: I0217 09:35:49.487017 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" event={"ID":"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92","Type":"ContainerStarted","Data":"ff73ea007ea81b7cbf814948928811981ce00e48f6d2ee2bdcd776ef3d7234ad"} Feb 17 09:35:49 crc kubenswrapper[4848]: I0217 09:35:49.487424 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" event={"ID":"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92","Type":"ContainerStarted","Data":"50101962ee5392c5bc76386f782edc0178f2b958231c7cd951d10078f4fec0e3"} Feb 17 09:35:49 crc kubenswrapper[4848]: I0217 09:35:49.518466 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" podStartSLOduration=2.037026974 podStartE2EDuration="2.518400603s" podCreationTimestamp="2026-02-17 09:35:47 +0000 UTC" firstStartedPulling="2026-02-17 09:35:48.577021381 +0000 UTC m=+1826.120277037" lastFinishedPulling="2026-02-17 09:35:49.05839503 +0000 UTC m=+1826.601650666" observedRunningTime="2026-02-17 09:35:49.505800646 +0000 UTC m=+1827.049056382" watchObservedRunningTime="2026-02-17 09:35:49.518400603 +0000 UTC m=+1827.061656289" Feb 17 09:35:53 crc kubenswrapper[4848]: I0217 
09:35:53.389356 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:35:53 crc kubenswrapper[4848]: E0217 09:35:53.391102 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:36:06 crc kubenswrapper[4848]: I0217 09:36:06.384393 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:36:06 crc kubenswrapper[4848]: E0217 09:36:06.385794 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:36:17 crc kubenswrapper[4848]: I0217 09:36:17.384364 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:36:17 crc kubenswrapper[4848]: E0217 09:36:17.385480 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:36:32 crc 
kubenswrapper[4848]: I0217 09:36:32.383750 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:36:32 crc kubenswrapper[4848]: E0217 09:36:32.384944 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:36:33 crc kubenswrapper[4848]: I0217 09:36:33.961536 4848 generic.go:334] "Generic (PLEG): container finished" podID="cb01f719-b45c-48ab-ba4a-6ffeef0d8b92" containerID="ff73ea007ea81b7cbf814948928811981ce00e48f6d2ee2bdcd776ef3d7234ad" exitCode=0 Feb 17 09:36:33 crc kubenswrapper[4848]: I0217 09:36:33.961612 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" event={"ID":"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92","Type":"ContainerDied","Data":"ff73ea007ea81b7cbf814948928811981ce00e48f6d2ee2bdcd776ef3d7234ad"} Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.397667 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.419687 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-inventory\") pod \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.420379 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-nova-metadata-neutron-config-0\") pod \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.420548 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-ovn-metadata-agent-neutron-config-0\") pod \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.420700 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjdx8\" (UniqueName: \"kubernetes.io/projected/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-kube-api-access-kjdx8\") pod \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.420977 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-metadata-combined-ca-bundle\") pod \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " Feb 
17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.421304 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-ssh-key-openstack-edpm-ipam\") pod \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\" (UID: \"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92\") " Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.432816 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-kube-api-access-kjdx8" (OuterVolumeSpecName: "kube-api-access-kjdx8") pod "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92" (UID: "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92"). InnerVolumeSpecName "kube-api-access-kjdx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.452102 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92" (UID: "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.454242 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92" (UID: "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.468488 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-inventory" (OuterVolumeSpecName: "inventory") pod "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92" (UID: "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.471503 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92" (UID: "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.488273 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92" (UID: "cb01f719-b45c-48ab-ba4a-6ffeef0d8b92"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.529103 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.529152 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.529164 4848 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.529182 4848 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.529195 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjdx8\" (UniqueName: \"kubernetes.io/projected/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-kube-api-access-kjdx8\") on node \"crc\" DevicePath \"\"" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.529208 4848 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb01f719-b45c-48ab-ba4a-6ffeef0d8b92-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.984712 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" event={"ID":"cb01f719-b45c-48ab-ba4a-6ffeef0d8b92","Type":"ContainerDied","Data":"50101962ee5392c5bc76386f782edc0178f2b958231c7cd951d10078f4fec0e3"} Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.984792 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50101962ee5392c5bc76386f782edc0178f2b958231c7cd951d10078f4fec0e3" Feb 17 09:36:35 crc kubenswrapper[4848]: I0217 09:36:35.984850 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.126448 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk"] Feb 17 09:36:36 crc kubenswrapper[4848]: E0217 09:36:36.127217 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb01f719-b45c-48ab-ba4a-6ffeef0d8b92" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.127250 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb01f719-b45c-48ab-ba4a-6ffeef0d8b92" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.127679 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb01f719-b45c-48ab-ba4a-6ffeef0d8b92" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.129276 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.132147 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.132189 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.132202 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.132268 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.132258 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.140583 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.140745 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.140854 4848 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.140894 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.140922 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6vwl\" (UniqueName: \"kubernetes.io/projected/a694769e-5bc0-4596-945c-2de9823168f0-kube-api-access-q6vwl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.156797 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk"] Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.242918 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.242992 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.243019 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6vwl\" (UniqueName: \"kubernetes.io/projected/a694769e-5bc0-4596-945c-2de9823168f0-kube-api-access-q6vwl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.243069 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.243197 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.247023 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: 
\"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.247163 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.247223 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.249660 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.273216 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6vwl\" (UniqueName: \"kubernetes.io/projected/a694769e-5bc0-4596-945c-2de9823168f0-kube-api-access-q6vwl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:36 crc kubenswrapper[4848]: I0217 09:36:36.446054 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:36:37 crc kubenswrapper[4848]: I0217 09:36:37.023277 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk"] Feb 17 09:36:37 crc kubenswrapper[4848]: I0217 09:36:37.043608 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 09:36:38 crc kubenswrapper[4848]: I0217 09:36:38.013013 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" event={"ID":"a694769e-5bc0-4596-945c-2de9823168f0","Type":"ContainerStarted","Data":"30fdba1047782a9005f70e10ed50d20a81beb7402903dfd25a72ee179f280583"} Feb 17 09:36:38 crc kubenswrapper[4848]: I0217 09:36:38.013330 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" event={"ID":"a694769e-5bc0-4596-945c-2de9823168f0","Type":"ContainerStarted","Data":"c0b387d0d330e65976ccc577e381d11fa8f88b7fdf165b690a2de74cf74defa6"} Feb 17 09:36:38 crc kubenswrapper[4848]: I0217 09:36:38.044446 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" podStartSLOduration=1.618797638 podStartE2EDuration="2.044425425s" podCreationTimestamp="2026-02-17 09:36:36 +0000 UTC" firstStartedPulling="2026-02-17 09:36:37.042718552 +0000 UTC m=+1874.585974248" lastFinishedPulling="2026-02-17 09:36:37.468346389 +0000 UTC m=+1875.011602035" observedRunningTime="2026-02-17 09:36:38.031607901 +0000 UTC m=+1875.574863587" watchObservedRunningTime="2026-02-17 09:36:38.044425425 +0000 UTC m=+1875.587681081" Feb 17 09:36:47 crc kubenswrapper[4848]: I0217 09:36:47.384250 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:36:47 crc kubenswrapper[4848]: E0217 
09:36:47.385471 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:37:01 crc kubenswrapper[4848]: I0217 09:37:01.383307 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:37:01 crc kubenswrapper[4848]: E0217 09:37:01.383990 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:37:12 crc kubenswrapper[4848]: I0217 09:37:12.384197 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:37:12 crc kubenswrapper[4848]: E0217 09:37:12.385186 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:37:27 crc kubenswrapper[4848]: I0217 09:37:27.383999 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:37:27 crc 
kubenswrapper[4848]: E0217 09:37:27.385093 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:37:40 crc kubenswrapper[4848]: I0217 09:37:40.384731 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:37:40 crc kubenswrapper[4848]: E0217 09:37:40.385617 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:37:53 crc kubenswrapper[4848]: I0217 09:37:53.400058 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:37:53 crc kubenswrapper[4848]: E0217 09:37:53.401555 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:38:04 crc kubenswrapper[4848]: I0217 09:38:04.385097 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 
17 09:38:04 crc kubenswrapper[4848]: E0217 09:38:04.386442 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:38:19 crc kubenswrapper[4848]: I0217 09:38:19.384604 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521"
Feb 17 09:38:20 crc kubenswrapper[4848]: I0217 09:38:20.204064 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"7fd570aced053602ec77ca215312b917159892f88fee9de807e73bca6b19517b"}
Feb 17 09:38:54 crc kubenswrapper[4848]: I0217 09:38:54.958654 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f7q2k"]
Feb 17 09:38:54 crc kubenswrapper[4848]: I0217 09:38:54.963882 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:54 crc kubenswrapper[4848]: I0217 09:38:54.983419 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7q2k"]
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.083348 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zs9t\" (UniqueName: \"kubernetes.io/projected/17314677-36a2-48b1-999e-5917bd3cc804-kube-api-access-5zs9t\") pod \"redhat-operators-f7q2k\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") " pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.083814 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-utilities\") pod \"redhat-operators-f7q2k\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") " pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.083901 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-catalog-content\") pod \"redhat-operators-f7q2k\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") " pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.185471 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zs9t\" (UniqueName: \"kubernetes.io/projected/17314677-36a2-48b1-999e-5917bd3cc804-kube-api-access-5zs9t\") pod \"redhat-operators-f7q2k\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") " pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.185621 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-utilities\") pod \"redhat-operators-f7q2k\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") " pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.185707 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-catalog-content\") pod \"redhat-operators-f7q2k\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") " pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.186157 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-utilities\") pod \"redhat-operators-f7q2k\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") " pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.186180 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-catalog-content\") pod \"redhat-operators-f7q2k\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") " pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.204879 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zs9t\" (UniqueName: \"kubernetes.io/projected/17314677-36a2-48b1-999e-5917bd3cc804-kube-api-access-5zs9t\") pod \"redhat-operators-f7q2k\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") " pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.296761 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:38:55 crc kubenswrapper[4848]: I0217 09:38:55.774269 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f7q2k"]
Feb 17 09:38:55 crc kubenswrapper[4848]: W0217 09:38:55.778896 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17314677_36a2_48b1_999e_5917bd3cc804.slice/crio-650b39571bbac0b1b7a6ad0180e9a3e7b50f4a8350a8c5c29e3a7bbf08d9d08a WatchSource:0}: Error finding container 650b39571bbac0b1b7a6ad0180e9a3e7b50f4a8350a8c5c29e3a7bbf08d9d08a: Status 404 returned error can't find the container with id 650b39571bbac0b1b7a6ad0180e9a3e7b50f4a8350a8c5c29e3a7bbf08d9d08a
Feb 17 09:38:56 crc kubenswrapper[4848]: I0217 09:38:56.628748 4848 generic.go:334] "Generic (PLEG): container finished" podID="17314677-36a2-48b1-999e-5917bd3cc804" containerID="bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55" exitCode=0
Feb 17 09:38:56 crc kubenswrapper[4848]: I0217 09:38:56.629425 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7q2k" event={"ID":"17314677-36a2-48b1-999e-5917bd3cc804","Type":"ContainerDied","Data":"bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55"}
Feb 17 09:38:56 crc kubenswrapper[4848]: I0217 09:38:56.629522 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7q2k" event={"ID":"17314677-36a2-48b1-999e-5917bd3cc804","Type":"ContainerStarted","Data":"650b39571bbac0b1b7a6ad0180e9a3e7b50f4a8350a8c5c29e3a7bbf08d9d08a"}
Feb 17 09:38:57 crc kubenswrapper[4848]: I0217 09:38:57.640584 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7q2k" event={"ID":"17314677-36a2-48b1-999e-5917bd3cc804","Type":"ContainerStarted","Data":"d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081"}
Feb 17 09:38:59 crc kubenswrapper[4848]: I0217 09:38:59.656158 4848 generic.go:334] "Generic (PLEG): container finished" podID="17314677-36a2-48b1-999e-5917bd3cc804" containerID="d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081" exitCode=0
Feb 17 09:38:59 crc kubenswrapper[4848]: I0217 09:38:59.656212 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7q2k" event={"ID":"17314677-36a2-48b1-999e-5917bd3cc804","Type":"ContainerDied","Data":"d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081"}
Feb 17 09:39:00 crc kubenswrapper[4848]: I0217 09:39:00.667887 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7q2k" event={"ID":"17314677-36a2-48b1-999e-5917bd3cc804","Type":"ContainerStarted","Data":"b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21"}
Feb 17 09:39:00 crc kubenswrapper[4848]: I0217 09:39:00.689779 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f7q2k" podStartSLOduration=3.061544681 podStartE2EDuration="6.689751878s" podCreationTimestamp="2026-02-17 09:38:54 +0000 UTC" firstStartedPulling="2026-02-17 09:38:56.632965429 +0000 UTC m=+2014.176221115" lastFinishedPulling="2026-02-17 09:39:00.261172666 +0000 UTC m=+2017.804428312" observedRunningTime="2026-02-17 09:39:00.687200875 +0000 UTC m=+2018.230456541" watchObservedRunningTime="2026-02-17 09:39:00.689751878 +0000 UTC m=+2018.233007524"
Feb 17 09:39:01 crc kubenswrapper[4848]: I0217 09:39:01.918806 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-68t75"]
Feb 17 09:39:01 crc kubenswrapper[4848]: I0217 09:39:01.921383 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:01 crc kubenswrapper[4848]: I0217 09:39:01.944724 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-68t75"]
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.021406 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w99gx\" (UniqueName: \"kubernetes.io/projected/e4be5ad7-b624-46ba-8dc2-eef530b47800-kube-api-access-w99gx\") pod \"certified-operators-68t75\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") " pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.021554 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-utilities\") pod \"certified-operators-68t75\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") " pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.021571 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-catalog-content\") pod \"certified-operators-68t75\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") " pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.122611 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-utilities\") pod \"certified-operators-68t75\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") " pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.122651 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-catalog-content\") pod \"certified-operators-68t75\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") " pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.122700 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w99gx\" (UniqueName: \"kubernetes.io/projected/e4be5ad7-b624-46ba-8dc2-eef530b47800-kube-api-access-w99gx\") pod \"certified-operators-68t75\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") " pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.123165 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-utilities\") pod \"certified-operators-68t75\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") " pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.123239 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-catalog-content\") pod \"certified-operators-68t75\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") " pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.145291 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w99gx\" (UniqueName: \"kubernetes.io/projected/e4be5ad7-b624-46ba-8dc2-eef530b47800-kube-api-access-w99gx\") pod \"certified-operators-68t75\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") " pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.250513 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:02 crc kubenswrapper[4848]: I0217 09:39:02.814406 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-68t75"]
Feb 17 09:39:02 crc kubenswrapper[4848]: W0217 09:39:02.833572 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4be5ad7_b624_46ba_8dc2_eef530b47800.slice/crio-6016bc8a6e869a011516fb20b55b4ccaa69919f47798c9358e99b75adbd935af WatchSource:0}: Error finding container 6016bc8a6e869a011516fb20b55b4ccaa69919f47798c9358e99b75adbd935af: Status 404 returned error can't find the container with id 6016bc8a6e869a011516fb20b55b4ccaa69919f47798c9358e99b75adbd935af
Feb 17 09:39:03 crc kubenswrapper[4848]: I0217 09:39:03.691983 4848 generic.go:334] "Generic (PLEG): container finished" podID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerID="a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8" exitCode=0
Feb 17 09:39:03 crc kubenswrapper[4848]: I0217 09:39:03.692053 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68t75" event={"ID":"e4be5ad7-b624-46ba-8dc2-eef530b47800","Type":"ContainerDied","Data":"a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8"}
Feb 17 09:39:03 crc kubenswrapper[4848]: I0217 09:39:03.692264 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68t75" event={"ID":"e4be5ad7-b624-46ba-8dc2-eef530b47800","Type":"ContainerStarted","Data":"6016bc8a6e869a011516fb20b55b4ccaa69919f47798c9358e99b75adbd935af"}
Feb 17 09:39:04 crc kubenswrapper[4848]: I0217 09:39:04.717327 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68t75" event={"ID":"e4be5ad7-b624-46ba-8dc2-eef530b47800","Type":"ContainerStarted","Data":"6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4"}
Feb 17 09:39:05 crc kubenswrapper[4848]: I0217 09:39:05.297173 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:39:05 crc kubenswrapper[4848]: I0217 09:39:05.297229 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:39:05 crc kubenswrapper[4848]: I0217 09:39:05.726043 4848 generic.go:334] "Generic (PLEG): container finished" podID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerID="6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4" exitCode=0
Feb 17 09:39:05 crc kubenswrapper[4848]: I0217 09:39:05.726158 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68t75" event={"ID":"e4be5ad7-b624-46ba-8dc2-eef530b47800","Type":"ContainerDied","Data":"6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4"}
Feb 17 09:39:06 crc kubenswrapper[4848]: I0217 09:39:06.374000 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7q2k" podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="registry-server" probeResult="failure" output=<
Feb 17 09:39:06 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s
Feb 17 09:39:06 crc kubenswrapper[4848]: >
Feb 17 09:39:06 crc kubenswrapper[4848]: I0217 09:39:06.738617 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68t75" event={"ID":"e4be5ad7-b624-46ba-8dc2-eef530b47800","Type":"ContainerStarted","Data":"f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d"}
Feb 17 09:39:06 crc kubenswrapper[4848]: I0217 09:39:06.766525 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-68t75" podStartSLOduration=3.35244246 podStartE2EDuration="5.766505875s" podCreationTimestamp="2026-02-17 09:39:01 +0000 UTC" firstStartedPulling="2026-02-17 09:39:03.693655938 +0000 UTC m=+2021.236911584" lastFinishedPulling="2026-02-17 09:39:06.107719353 +0000 UTC m=+2023.650974999" observedRunningTime="2026-02-17 09:39:06.756621142 +0000 UTC m=+2024.299876798" watchObservedRunningTime="2026-02-17 09:39:06.766505875 +0000 UTC m=+2024.309761531"
Feb 17 09:39:12 crc kubenswrapper[4848]: I0217 09:39:12.251343 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:12 crc kubenswrapper[4848]: I0217 09:39:12.251994 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:12 crc kubenswrapper[4848]: I0217 09:39:12.304687 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:12 crc kubenswrapper[4848]: I0217 09:39:12.840704 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:12 crc kubenswrapper[4848]: I0217 09:39:12.920255 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-68t75"]
Feb 17 09:39:14 crc kubenswrapper[4848]: I0217 09:39:14.807536 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-68t75" podUID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerName="registry-server" containerID="cri-o://f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d" gracePeriod=2
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.311369 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.470990 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-catalog-content\") pod \"e4be5ad7-b624-46ba-8dc2-eef530b47800\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") "
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.471402 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w99gx\" (UniqueName: \"kubernetes.io/projected/e4be5ad7-b624-46ba-8dc2-eef530b47800-kube-api-access-w99gx\") pod \"e4be5ad7-b624-46ba-8dc2-eef530b47800\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") "
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.471459 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-utilities\") pod \"e4be5ad7-b624-46ba-8dc2-eef530b47800\" (UID: \"e4be5ad7-b624-46ba-8dc2-eef530b47800\") "
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.472044 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-utilities" (OuterVolumeSpecName: "utilities") pod "e4be5ad7-b624-46ba-8dc2-eef530b47800" (UID: "e4be5ad7-b624-46ba-8dc2-eef530b47800"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.480969 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4be5ad7-b624-46ba-8dc2-eef530b47800-kube-api-access-w99gx" (OuterVolumeSpecName: "kube-api-access-w99gx") pod "e4be5ad7-b624-46ba-8dc2-eef530b47800" (UID: "e4be5ad7-b624-46ba-8dc2-eef530b47800"). InnerVolumeSpecName "kube-api-access-w99gx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.539038 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4be5ad7-b624-46ba-8dc2-eef530b47800" (UID: "e4be5ad7-b624-46ba-8dc2-eef530b47800"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.574167 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w99gx\" (UniqueName: \"kubernetes.io/projected/e4be5ad7-b624-46ba-8dc2-eef530b47800-kube-api-access-w99gx\") on node \"crc\" DevicePath \"\""
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.574203 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.574216 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4be5ad7-b624-46ba-8dc2-eef530b47800-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.822959 4848 generic.go:334] "Generic (PLEG): container finished" podID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerID="f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d" exitCode=0
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.823049 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68t75" event={"ID":"e4be5ad7-b624-46ba-8dc2-eef530b47800","Type":"ContainerDied","Data":"f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d"}
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.823117 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-68t75"
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.823961 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-68t75" event={"ID":"e4be5ad7-b624-46ba-8dc2-eef530b47800","Type":"ContainerDied","Data":"6016bc8a6e869a011516fb20b55b4ccaa69919f47798c9358e99b75adbd935af"}
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.823987 4848 scope.go:117] "RemoveContainer" containerID="f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d"
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.845146 4848 scope.go:117] "RemoveContainer" containerID="6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4"
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.875026 4848 scope.go:117] "RemoveContainer" containerID="a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8"
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.879846 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-68t75"]
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.889853 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-68t75"]
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.928870 4848 scope.go:117] "RemoveContainer" containerID="f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d"
Feb 17 09:39:15 crc kubenswrapper[4848]: E0217 09:39:15.929249 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d\": container with ID starting with f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d not found: ID does not exist" containerID="f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d"
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.929294 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d"} err="failed to get container status \"f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d\": rpc error: code = NotFound desc = could not find container \"f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d\": container with ID starting with f672b4f3d1589ef6b0ae11c284375ca036f1dd10fee7635b1197979a0be0d11d not found: ID does not exist"
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.929325 4848 scope.go:117] "RemoveContainer" containerID="6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4"
Feb 17 09:39:15 crc kubenswrapper[4848]: E0217 09:39:15.929706 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4\": container with ID starting with 6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4 not found: ID does not exist" containerID="6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4"
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.929746 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4"} err="failed to get container status \"6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4\": rpc error: code = NotFound desc = could not find container \"6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4\": container with ID starting with 6945ae1ce5c09fddf567cf13962bffab054cbfbf4f6e99ddcdd387666a915dc4 not found: ID does not exist"
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.929822 4848 scope.go:117] "RemoveContainer" containerID="a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8"
Feb 17 09:39:15 crc kubenswrapper[4848]: E0217 09:39:15.930287 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8\": container with ID starting with a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8 not found: ID does not exist" containerID="a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8"
Feb 17 09:39:15 crc kubenswrapper[4848]: I0217 09:39:15.930320 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8"} err="failed to get container status \"a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8\": rpc error: code = NotFound desc = could not find container \"a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8\": container with ID starting with a06b17da805f21969d4aa9a80696b2221d08de654d36cac937e597f6e36047b8 not found: ID does not exist"
Feb 17 09:39:16 crc kubenswrapper[4848]: I0217 09:39:16.363119 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7q2k" podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="registry-server" probeResult="failure" output=<
Feb 17 09:39:16 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s
Feb 17 09:39:16 crc kubenswrapper[4848]: >
Feb 17 09:39:17 crc kubenswrapper[4848]: I0217 09:39:17.406118 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4be5ad7-b624-46ba-8dc2-eef530b47800" path="/var/lib/kubelet/pods/e4be5ad7-b624-46ba-8dc2-eef530b47800/volumes"
Feb 17 09:39:26 crc kubenswrapper[4848]: I0217 09:39:26.365024 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f7q2k" podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="registry-server" probeResult="failure" output=<
Feb 17 09:39:26 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s
Feb 17 09:39:26 crc kubenswrapper[4848]: >
Feb 17 09:39:35 crc kubenswrapper[4848]: I0217 09:39:35.395578 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:39:35 crc kubenswrapper[4848]: I0217 09:39:35.468824 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:39:35 crc kubenswrapper[4848]: I0217 09:39:35.641260 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f7q2k"]
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.032333 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f7q2k" podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="registry-server" containerID="cri-o://b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21" gracePeriod=2
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.497287 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.605316 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zs9t\" (UniqueName: \"kubernetes.io/projected/17314677-36a2-48b1-999e-5917bd3cc804-kube-api-access-5zs9t\") pod \"17314677-36a2-48b1-999e-5917bd3cc804\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") "
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.605613 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-catalog-content\") pod \"17314677-36a2-48b1-999e-5917bd3cc804\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") "
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.605802 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-utilities\") pod \"17314677-36a2-48b1-999e-5917bd3cc804\" (UID: \"17314677-36a2-48b1-999e-5917bd3cc804\") "
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.606729 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-utilities" (OuterVolumeSpecName: "utilities") pod "17314677-36a2-48b1-999e-5917bd3cc804" (UID: "17314677-36a2-48b1-999e-5917bd3cc804"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.611428 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17314677-36a2-48b1-999e-5917bd3cc804-kube-api-access-5zs9t" (OuterVolumeSpecName: "kube-api-access-5zs9t") pod "17314677-36a2-48b1-999e-5917bd3cc804" (UID: "17314677-36a2-48b1-999e-5917bd3cc804"). InnerVolumeSpecName "kube-api-access-5zs9t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.707793 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.707828 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zs9t\" (UniqueName: \"kubernetes.io/projected/17314677-36a2-48b1-999e-5917bd3cc804-kube-api-access-5zs9t\") on node \"crc\" DevicePath \"\""
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.754620 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "17314677-36a2-48b1-999e-5917bd3cc804" (UID: "17314677-36a2-48b1-999e-5917bd3cc804"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 09:39:37 crc kubenswrapper[4848]: I0217 09:39:37.812628 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/17314677-36a2-48b1-999e-5917bd3cc804-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.045589 4848 generic.go:334] "Generic (PLEG): container finished" podID="17314677-36a2-48b1-999e-5917bd3cc804" containerID="b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21" exitCode=0
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.045645 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7q2k" event={"ID":"17314677-36a2-48b1-999e-5917bd3cc804","Type":"ContainerDied","Data":"b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21"}
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.045677 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f7q2k" event={"ID":"17314677-36a2-48b1-999e-5917bd3cc804","Type":"ContainerDied","Data":"650b39571bbac0b1b7a6ad0180e9a3e7b50f4a8350a8c5c29e3a7bbf08d9d08a"}
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.045724 4848 scope.go:117] "RemoveContainer" containerID="b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21"
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.045734 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f7q2k"
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.080074 4848 scope.go:117] "RemoveContainer" containerID="d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081"
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.110955 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f7q2k"]
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.127656 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f7q2k"]
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.136018 4848 scope.go:117] "RemoveContainer" containerID="bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55"
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.180243 4848 scope.go:117] "RemoveContainer" containerID="b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21"
Feb 17 09:39:38 crc kubenswrapper[4848]: E0217 09:39:38.180833 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21\": container with ID starting with b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21 not found: ID does not exist" containerID="b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21"
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.180884 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21"} err="failed to get container status \"b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21\": rpc error: code = NotFound desc = could not find container \"b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21\": container with ID starting with b7ff35be3251cc30457b06d1cb80c55066dc11e921c8d25045068816b42fad21 not found: ID does not exist"
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.180913 4848 scope.go:117] "RemoveContainer" containerID="d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081"
Feb 17 09:39:38 crc kubenswrapper[4848]: E0217 09:39:38.181309 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081\": container with ID starting with d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081 not found: ID does not exist" containerID="d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081"
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.181339 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081"} err="failed to get container status \"d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081\": rpc error: code = NotFound desc = could not find container \"d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081\": container with ID starting with d07cd86647e562d0eb1f13ffa24a90de3623394a6caa5cd1147eca9f4f031081 not found: ID does not exist"
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.181359 4848 scope.go:117] "RemoveContainer" containerID="bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55"
Feb 17 09:39:38 crc kubenswrapper[4848]: E0217 09:39:38.181732 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55\": container with ID starting with bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55 not found: ID does not exist" containerID="bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55"
Feb 17 09:39:38 crc kubenswrapper[4848]: I0217 09:39:38.181781 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55"} err="failed to get container status \"bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55\": rpc error: code = NotFound desc = could not find container \"bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55\": container with ID starting with bddefa8dbf0fe3694b262532aedc7a02fc7ce796b9eb05db096e471315aebd55 not found: ID does not exist"
Feb 17 09:39:39 crc kubenswrapper[4848]: I0217 09:39:39.393641 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17314677-36a2-48b1-999e-5917bd3cc804" path="/var/lib/kubelet/pods/17314677-36a2-48b1-999e-5917bd3cc804/volumes"
Feb 17 09:40:16 crc kubenswrapper[4848]: I0217 09:40:16.399175 4848 generic.go:334] "Generic (PLEG): container finished" podID="a694769e-5bc0-4596-945c-2de9823168f0" containerID="30fdba1047782a9005f70e10ed50d20a81beb7402903dfd25a72ee179f280583" exitCode=0
Feb 17 09:40:16 crc kubenswrapper[4848]: I0217 09:40:16.399231 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" event={"ID":"a694769e-5bc0-4596-945c-2de9823168f0","Type":"ContainerDied","Data":"30fdba1047782a9005f70e10ed50d20a81beb7402903dfd25a72ee179f280583"}
Feb 17 09:40:17 crc kubenswrapper[4848]: I0217 09:40:17.992905 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.031297 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6vwl\" (UniqueName: \"kubernetes.io/projected/a694769e-5bc0-4596-945c-2de9823168f0-kube-api-access-q6vwl\") pod \"a694769e-5bc0-4596-945c-2de9823168f0\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.031363 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-ssh-key-openstack-edpm-ipam\") pod \"a694769e-5bc0-4596-945c-2de9823168f0\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.031469 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-combined-ca-bundle\") pod \"a694769e-5bc0-4596-945c-2de9823168f0\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.031560 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-secret-0\") pod \"a694769e-5bc0-4596-945c-2de9823168f0\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.031611 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-inventory\") pod \"a694769e-5bc0-4596-945c-2de9823168f0\" (UID: \"a694769e-5bc0-4596-945c-2de9823168f0\") " Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.039075 4848 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a694769e-5bc0-4596-945c-2de9823168f0-kube-api-access-q6vwl" (OuterVolumeSpecName: "kube-api-access-q6vwl") pod "a694769e-5bc0-4596-945c-2de9823168f0" (UID: "a694769e-5bc0-4596-945c-2de9823168f0"). InnerVolumeSpecName "kube-api-access-q6vwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.045884 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a694769e-5bc0-4596-945c-2de9823168f0" (UID: "a694769e-5bc0-4596-945c-2de9823168f0"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.067406 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a694769e-5bc0-4596-945c-2de9823168f0" (UID: "a694769e-5bc0-4596-945c-2de9823168f0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.082103 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-inventory" (OuterVolumeSpecName: "inventory") pod "a694769e-5bc0-4596-945c-2de9823168f0" (UID: "a694769e-5bc0-4596-945c-2de9823168f0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.085111 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "a694769e-5bc0-4596-945c-2de9823168f0" (UID: "a694769e-5bc0-4596-945c-2de9823168f0"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.134882 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6vwl\" (UniqueName: \"kubernetes.io/projected/a694769e-5bc0-4596-945c-2de9823168f0-kube-api-access-q6vwl\") on node \"crc\" DevicePath \"\"" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.134939 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.134959 4848 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.134978 4848 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.135089 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a694769e-5bc0-4596-945c-2de9823168f0-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.434281 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" event={"ID":"a694769e-5bc0-4596-945c-2de9823168f0","Type":"ContainerDied","Data":"c0b387d0d330e65976ccc577e381d11fa8f88b7fdf165b690a2de74cf74defa6"} Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.434585 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0b387d0d330e65976ccc577e381d11fa8f88b7fdf165b690a2de74cf74defa6" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.434355 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.651614 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf"] Feb 17 09:40:18 crc kubenswrapper[4848]: E0217 09:40:18.652327 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a694769e-5bc0-4596-945c-2de9823168f0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.652376 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="a694769e-5bc0-4596-945c-2de9823168f0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 09:40:18 crc kubenswrapper[4848]: E0217 09:40:18.652410 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerName="registry-server" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.652427 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerName="registry-server" Feb 17 09:40:18 crc kubenswrapper[4848]: E0217 09:40:18.652450 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="extract-content" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.652462 4848 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="extract-content" Feb 17 09:40:18 crc kubenswrapper[4848]: E0217 09:40:18.652493 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="extract-utilities" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.652507 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="extract-utilities" Feb 17 09:40:18 crc kubenswrapper[4848]: E0217 09:40:18.652536 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerName="extract-content" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.652549 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerName="extract-content" Feb 17 09:40:18 crc kubenswrapper[4848]: E0217 09:40:18.652566 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="registry-server" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.652578 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="registry-server" Feb 17 09:40:18 crc kubenswrapper[4848]: E0217 09:40:18.652607 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerName="extract-utilities" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.652620 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerName="extract-utilities" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.652982 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="17314677-36a2-48b1-999e-5917bd3cc804" containerName="registry-server" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.653037 4848 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="a694769e-5bc0-4596-945c-2de9823168f0" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.653059 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4be5ad7-b624-46ba-8dc2-eef530b47800" containerName="registry-server" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.654204 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.656663 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.657060 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.657752 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.658283 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.658637 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.659375 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.662962 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.672144 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf"] Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.746599 4848 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.746663 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.746703 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.746722 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.746821 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.746863 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.746891 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.746907 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f8s2\" (UniqueName: \"kubernetes.io/projected/8b957c61-bd51-415a-9d34-da20cb8ebd55-kube-api-access-7f8s2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.746964 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: 
\"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.747064 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.747100 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.849908 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.850086 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.850195 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.850264 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.850345 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.850401 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.850558 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-combined-ca-bundle\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.850670 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.850750 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.850847 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7f8s2\" (UniqueName: \"kubernetes.io/projected/8b957c61-bd51-415a-9d34-da20cb8ebd55-kube-api-access-7f8s2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.850940 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.854183 
4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.856713 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.858718 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.859490 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.859594 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: 
\"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.859785 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.859904 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.861167 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.866832 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.866995 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.868468 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f8s2\" (UniqueName: \"kubernetes.io/projected/8b957c61-bd51-415a-9d34-da20cb8ebd55-kube-api-access-7f8s2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-jnqsf\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:18 crc kubenswrapper[4848]: I0217 09:40:18.981094 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:40:19 crc kubenswrapper[4848]: I0217 09:40:19.596308 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf"] Feb 17 09:40:20 crc kubenswrapper[4848]: I0217 09:40:20.451629 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" event={"ID":"8b957c61-bd51-415a-9d34-da20cb8ebd55","Type":"ContainerStarted","Data":"d07346a22f5821c8aaa8ce7a34f6094137b16222b52618203799f61b798e8403"} Feb 17 09:40:20 crc kubenswrapper[4848]: I0217 09:40:20.451984 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" event={"ID":"8b957c61-bd51-415a-9d34-da20cb8ebd55","Type":"ContainerStarted","Data":"d4e89241a84c3f6d3fd0620c0141737a195fbc0ca2d28c7d678ecd03624f0339"} Feb 17 09:40:20 crc kubenswrapper[4848]: I0217 09:40:20.477562 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" 
podStartSLOduration=2.001208499 podStartE2EDuration="2.477545414s" podCreationTimestamp="2026-02-17 09:40:18 +0000 UTC" firstStartedPulling="2026-02-17 09:40:19.604102366 +0000 UTC m=+2097.147358022" lastFinishedPulling="2026-02-17 09:40:20.080439281 +0000 UTC m=+2097.623694937" observedRunningTime="2026-02-17 09:40:20.474719593 +0000 UTC m=+2098.017975249" watchObservedRunningTime="2026-02-17 09:40:20.477545414 +0000 UTC m=+2098.020801060" Feb 17 09:40:48 crc kubenswrapper[4848]: I0217 09:40:48.772266 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:40:48 crc kubenswrapper[4848]: I0217 09:40:48.772940 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.359315 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xrgpk"] Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.368103 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.378024 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xrgpk"] Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.474207 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-catalog-content\") pod \"community-operators-xrgpk\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.474550 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsgvb\" (UniqueName: \"kubernetes.io/projected/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-kube-api-access-tsgvb\") pod \"community-operators-xrgpk\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.474735 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-utilities\") pod \"community-operators-xrgpk\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.575982 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-catalog-content\") pod \"community-operators-xrgpk\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.576103 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tsgvb\" (UniqueName: \"kubernetes.io/projected/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-kube-api-access-tsgvb\") pod \"community-operators-xrgpk\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.576159 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-utilities\") pod \"community-operators-xrgpk\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.576634 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-utilities\") pod \"community-operators-xrgpk\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.577265 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-catalog-content\") pod \"community-operators-xrgpk\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.612151 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsgvb\" (UniqueName: \"kubernetes.io/projected/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-kube-api-access-tsgvb\") pod \"community-operators-xrgpk\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:11 crc kubenswrapper[4848]: I0217 09:41:11.718841 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:12 crc kubenswrapper[4848]: I0217 09:41:12.212723 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xrgpk"] Feb 17 09:41:13 crc kubenswrapper[4848]: I0217 09:41:13.055227 4848 generic.go:334] "Generic (PLEG): container finished" podID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerID="03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2" exitCode=0 Feb 17 09:41:13 crc kubenswrapper[4848]: I0217 09:41:13.056595 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrgpk" event={"ID":"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8","Type":"ContainerDied","Data":"03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2"} Feb 17 09:41:13 crc kubenswrapper[4848]: I0217 09:41:13.056675 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrgpk" event={"ID":"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8","Type":"ContainerStarted","Data":"43128dfa6b1cb0d520c4ed7f4709ed2ed88d9d6232c67ed7a2bc1c8d74656809"} Feb 17 09:41:15 crc kubenswrapper[4848]: I0217 09:41:15.073399 4848 generic.go:334] "Generic (PLEG): container finished" podID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerID="e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f" exitCode=0 Feb 17 09:41:15 crc kubenswrapper[4848]: I0217 09:41:15.073476 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrgpk" event={"ID":"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8","Type":"ContainerDied","Data":"e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f"} Feb 17 09:41:16 crc kubenswrapper[4848]: I0217 09:41:16.086873 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrgpk" 
event={"ID":"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8","Type":"ContainerStarted","Data":"1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682"} Feb 17 09:41:16 crc kubenswrapper[4848]: I0217 09:41:16.115338 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xrgpk" podStartSLOduration=2.696673825 podStartE2EDuration="5.115311211s" podCreationTimestamp="2026-02-17 09:41:11 +0000 UTC" firstStartedPulling="2026-02-17 09:41:13.058741469 +0000 UTC m=+2150.601997125" lastFinishedPulling="2026-02-17 09:41:15.477378835 +0000 UTC m=+2153.020634511" observedRunningTime="2026-02-17 09:41:16.102415753 +0000 UTC m=+2153.645671439" watchObservedRunningTime="2026-02-17 09:41:16.115311211 +0000 UTC m=+2153.658566897" Feb 17 09:41:18 crc kubenswrapper[4848]: I0217 09:41:18.771793 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:41:18 crc kubenswrapper[4848]: I0217 09:41:18.772188 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:41:21 crc kubenswrapper[4848]: I0217 09:41:21.719991 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:21 crc kubenswrapper[4848]: I0217 09:41:21.720631 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:21 crc kubenswrapper[4848]: I0217 09:41:21.775803 4848 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:21 crc kubenswrapper[4848]: I0217 09:41:21.990598 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:22 crc kubenswrapper[4848]: I0217 09:41:22.052428 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xrgpk"] Feb 17 09:41:23 crc kubenswrapper[4848]: I0217 09:41:23.955624 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xrgpk" podUID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerName="registry-server" containerID="cri-o://1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682" gracePeriod=2 Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.470349 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.651075 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-utilities\") pod \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.651286 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-catalog-content\") pod \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.651469 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsgvb\" (UniqueName: 
\"kubernetes.io/projected/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-kube-api-access-tsgvb\") pod \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\" (UID: \"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8\") " Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.652195 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-utilities" (OuterVolumeSpecName: "utilities") pod "26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" (UID: "26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.658048 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-kube-api-access-tsgvb" (OuterVolumeSpecName: "kube-api-access-tsgvb") pod "26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" (UID: "26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8"). InnerVolumeSpecName "kube-api-access-tsgvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.753414 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsgvb\" (UniqueName: \"kubernetes.io/projected/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-kube-api-access-tsgvb\") on node \"crc\" DevicePath \"\"" Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.753682 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.966199 4848 generic.go:334] "Generic (PLEG): container finished" podID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerID="1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682" exitCode=0 Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.966250 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrgpk" event={"ID":"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8","Type":"ContainerDied","Data":"1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682"} Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.966288 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xrgpk" event={"ID":"26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8","Type":"ContainerDied","Data":"43128dfa6b1cb0d520c4ed7f4709ed2ed88d9d6232c67ed7a2bc1c8d74656809"} Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.966315 4848 scope.go:117] "RemoveContainer" containerID="1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682" Feb 17 09:41:24 crc kubenswrapper[4848]: I0217 09:41:24.966344 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xrgpk" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.002004 4848 scope.go:117] "RemoveContainer" containerID="e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.032018 4848 scope.go:117] "RemoveContainer" containerID="03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.106075 4848 scope.go:117] "RemoveContainer" containerID="1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682" Feb 17 09:41:25 crc kubenswrapper[4848]: E0217 09:41:25.106626 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682\": container with ID starting with 1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682 not found: ID does not exist" containerID="1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.106680 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682"} err="failed to get container status \"1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682\": rpc error: code = NotFound desc = could not find container \"1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682\": container with ID starting with 1bb8fe59736a2e372130807ddc8ca27f257532aa223d58a0ce204aabe76e9682 not found: ID does not exist" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.106718 4848 scope.go:117] "RemoveContainer" containerID="e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f" Feb 17 09:41:25 crc kubenswrapper[4848]: E0217 09:41:25.107092 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f\": container with ID starting with e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f not found: ID does not exist" containerID="e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.107130 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f"} err="failed to get container status \"e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f\": rpc error: code = NotFound desc = could not find container \"e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f\": container with ID starting with e8c85f60e9631638247d4e4dd740f504ae17648ddddeb32a7769616699c0321f not found: ID does not exist" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.107164 4848 scope.go:117] "RemoveContainer" containerID="03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2" Feb 17 09:41:25 crc kubenswrapper[4848]: E0217 09:41:25.107423 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2\": container with ID starting with 03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2 not found: ID does not exist" containerID="03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.107451 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2"} err="failed to get container status \"03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2\": rpc error: code = NotFound desc = could not find container 
\"03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2\": container with ID starting with 03297e42e29b0124f64e0a73d3a1b4d3ce5b536527f7c175d35cc2179907dba2 not found: ID does not exist" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.263179 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" (UID: "26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.324012 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xrgpk"] Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.336350 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xrgpk"] Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.364647 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:41:25 crc kubenswrapper[4848]: I0217 09:41:25.397748 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" path="/var/lib/kubelet/pods/26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8/volumes" Feb 17 09:41:48 crc kubenswrapper[4848]: I0217 09:41:48.771465 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:41:48 crc kubenswrapper[4848]: I0217 09:41:48.771949 4848 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:41:48 crc kubenswrapper[4848]: I0217 09:41:48.771996 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:41:48 crc kubenswrapper[4848]: I0217 09:41:48.772848 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fd570aced053602ec77ca215312b917159892f88fee9de807e73bca6b19517b"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:41:48 crc kubenswrapper[4848]: I0217 09:41:48.772911 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://7fd570aced053602ec77ca215312b917159892f88fee9de807e73bca6b19517b" gracePeriod=600 Feb 17 09:41:49 crc kubenswrapper[4848]: I0217 09:41:49.204876 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="7fd570aced053602ec77ca215312b917159892f88fee9de807e73bca6b19517b" exitCode=0 Feb 17 09:41:49 crc kubenswrapper[4848]: I0217 09:41:49.204973 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"7fd570aced053602ec77ca215312b917159892f88fee9de807e73bca6b19517b"} Feb 17 09:41:49 crc kubenswrapper[4848]: I0217 09:41:49.205280 4848 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"} Feb 17 09:41:49 crc kubenswrapper[4848]: I0217 09:41:49.205303 4848 scope.go:117] "RemoveContainer" containerID="e242bb493f6e9696c8363f27791988fe2427ed2d17c5e2c3b3769be5bd98d521" Feb 17 09:42:36 crc kubenswrapper[4848]: I0217 09:42:36.745035 4848 generic.go:334] "Generic (PLEG): container finished" podID="8b957c61-bd51-415a-9d34-da20cb8ebd55" containerID="d07346a22f5821c8aaa8ce7a34f6094137b16222b52618203799f61b798e8403" exitCode=0 Feb 17 09:42:36 crc kubenswrapper[4848]: I0217 09:42:36.745124 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" event={"ID":"8b957c61-bd51-415a-9d34-da20cb8ebd55","Type":"ContainerDied","Data":"d07346a22f5821c8aaa8ce7a34f6094137b16222b52618203799f61b798e8403"} Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.210095 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.243372 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-1\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.243419 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-0\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.243440 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-0\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.243484 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-2\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.243507 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-ssh-key-openstack-edpm-ipam\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc 
kubenswrapper[4848]: I0217 09:42:38.278592 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.278694 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.283353 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.284138 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.295700 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.344960 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-1\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.345054 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7f8s2\" (UniqueName: \"kubernetes.io/projected/8b957c61-bd51-415a-9d34-da20cb8ebd55-kube-api-access-7f8s2\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.345149 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-combined-ca-bundle\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.345203 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-inventory\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc 
kubenswrapper[4848]: I0217 09:42:38.345239 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-3\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.345342 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-extra-config-0\") pod \"8b957c61-bd51-415a-9d34-da20cb8ebd55\" (UID: \"8b957c61-bd51-415a-9d34-da20cb8ebd55\") " Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.346466 4848 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.346506 4848 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.346519 4848 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.346530 4848 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.346540 4848 reconciler_common.go:293] "Volume detached for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.349358 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.349713 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b957c61-bd51-415a-9d34-da20cb8ebd55-kube-api-access-7f8s2" (OuterVolumeSpecName: "kube-api-access-7f8s2") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "kube-api-access-7f8s2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.369821 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.385302 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.391995 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.393030 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-inventory" (OuterVolumeSpecName: "inventory") pod "8b957c61-bd51-415a-9d34-da20cb8ebd55" (UID: "8b957c61-bd51-415a-9d34-da20cb8ebd55"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.449036 4848 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.449069 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7f8s2\" (UniqueName: \"kubernetes.io/projected/8b957c61-bd51-415a-9d34-da20cb8ebd55-kube-api-access-7f8s2\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.449079 4848 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.449090 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.449099 4848 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.449108 4848 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/8b957c61-bd51-415a-9d34-da20cb8ebd55-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.773227 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" event={"ID":"8b957c61-bd51-415a-9d34-da20cb8ebd55","Type":"ContainerDied","Data":"d4e89241a84c3f6d3fd0620c0141737a195fbc0ca2d28c7d678ecd03624f0339"} Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.773286 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4e89241a84c3f6d3fd0620c0141737a195fbc0ca2d28c7d678ecd03624f0339" Feb 17 09:42:38 crc kubenswrapper[4848]: I0217 09:42:38.773303 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-jnqsf" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.061046 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455"] Feb 17 09:42:39 crc kubenswrapper[4848]: E0217 09:42:39.062071 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerName="registry-server" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.062099 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerName="registry-server" Feb 17 09:42:39 crc kubenswrapper[4848]: E0217 09:42:39.062139 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerName="extract-utilities" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.062153 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerName="extract-utilities" Feb 17 09:42:39 crc kubenswrapper[4848]: E0217 09:42:39.062172 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerName="extract-content" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.062186 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerName="extract-content" Feb 17 09:42:39 crc kubenswrapper[4848]: E0217 09:42:39.062205 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b957c61-bd51-415a-9d34-da20cb8ebd55" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.062218 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b957c61-bd51-415a-9d34-da20cb8ebd55" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.062529 4848 
memory_manager.go:354] "RemoveStaleState removing state" podUID="26a08f5a-5efb-4756-aa8c-3eeb8b3da7e8" containerName="registry-server" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.062586 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b957c61-bd51-415a-9d34-da20cb8ebd55" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.063564 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.085164 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455"] Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.111253 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.111513 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-df2nq" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.111592 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.111700 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.113910 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.266264 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbrwp\" (UniqueName: \"kubernetes.io/projected/78d70d99-629a-4211-9ead-66a16b766326-kube-api-access-rbrwp\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.266444 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.266529 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.266566 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.266614 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.266831 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.266926 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.369415 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.369618 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.369719 4848 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.369811 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbrwp\" (UniqueName: \"kubernetes.io/projected/78d70d99-629a-4211-9ead-66a16b766326-kube-api-access-rbrwp\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.369888 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.369951 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.369989 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.376083 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.376792 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.380189 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.381870 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.381948 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.382057 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.402334 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbrwp\" (UniqueName: \"kubernetes.io/projected/78d70d99-629a-4211-9ead-66a16b766326-kube-api-access-rbrwp\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-x4455\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:39 crc kubenswrapper[4848]: I0217 09:42:39.470201 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:42:40 crc kubenswrapper[4848]: I0217 09:42:40.125722 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 09:42:40 crc kubenswrapper[4848]: I0217 09:42:40.142143 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455"] Feb 17 09:42:40 crc kubenswrapper[4848]: I0217 09:42:40.794825 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" event={"ID":"78d70d99-629a-4211-9ead-66a16b766326","Type":"ContainerStarted","Data":"058ddd475c360a9af201a9262443f2658bd25aa06d6127a2dbd33413642fc380"} Feb 17 09:42:41 crc kubenswrapper[4848]: I0217 09:42:41.812611 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" event={"ID":"78d70d99-629a-4211-9ead-66a16b766326","Type":"ContainerStarted","Data":"b7ef5605c97534a5ddac5803d05ffea34c43b2926e5c0c91838bf3772e47f0f3"} Feb 17 09:42:41 crc kubenswrapper[4848]: I0217 09:42:41.857092 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" podStartSLOduration=2.408136311 podStartE2EDuration="2.857061679s" podCreationTimestamp="2026-02-17 09:42:39 +0000 UTC" firstStartedPulling="2026-02-17 09:42:40.125248479 +0000 UTC m=+2237.668504145" lastFinishedPulling="2026-02-17 09:42:40.574173877 +0000 UTC m=+2238.117429513" observedRunningTime="2026-02-17 09:42:41.85009143 +0000 UTC m=+2239.393347086" watchObservedRunningTime="2026-02-17 09:42:41.857061679 +0000 UTC m=+2239.400317365" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.501739 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-76x4l"] Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.505353 
4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.525190 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76x4l"] Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.598232 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-utilities\") pod \"redhat-marketplace-76x4l\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.598334 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-catalog-content\") pod \"redhat-marketplace-76x4l\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.598682 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t2hm\" (UniqueName: \"kubernetes.io/projected/cfa54fed-3896-41e8-a291-503699247a5f-kube-api-access-2t2hm\") pod \"redhat-marketplace-76x4l\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.700136 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t2hm\" (UniqueName: \"kubernetes.io/projected/cfa54fed-3896-41e8-a291-503699247a5f-kube-api-access-2t2hm\") pod \"redhat-marketplace-76x4l\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 
09:44:00.700254 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-utilities\") pod \"redhat-marketplace-76x4l\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.700306 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-catalog-content\") pod \"redhat-marketplace-76x4l\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.700835 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-catalog-content\") pod \"redhat-marketplace-76x4l\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.700935 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-utilities\") pod \"redhat-marketplace-76x4l\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.723357 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t2hm\" (UniqueName: \"kubernetes.io/projected/cfa54fed-3896-41e8-a291-503699247a5f-kube-api-access-2t2hm\") pod \"redhat-marketplace-76x4l\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:00 crc kubenswrapper[4848]: I0217 09:44:00.827663 4848 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:01 crc kubenswrapper[4848]: I0217 09:44:01.288699 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-76x4l"] Feb 17 09:44:01 crc kubenswrapper[4848]: W0217 09:44:01.300739 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfa54fed_3896_41e8_a291_503699247a5f.slice/crio-d4b4141f056b9afc49cb7f7ee7a340ed627c7d04a170354c5e4377f8565acda8 WatchSource:0}: Error finding container d4b4141f056b9afc49cb7f7ee7a340ed627c7d04a170354c5e4377f8565acda8: Status 404 returned error can't find the container with id d4b4141f056b9afc49cb7f7ee7a340ed627c7d04a170354c5e4377f8565acda8 Feb 17 09:44:01 crc kubenswrapper[4848]: I0217 09:44:01.637954 4848 generic.go:334] "Generic (PLEG): container finished" podID="cfa54fed-3896-41e8-a291-503699247a5f" containerID="cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0" exitCode=0 Feb 17 09:44:01 crc kubenswrapper[4848]: I0217 09:44:01.638004 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76x4l" event={"ID":"cfa54fed-3896-41e8-a291-503699247a5f","Type":"ContainerDied","Data":"cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0"} Feb 17 09:44:01 crc kubenswrapper[4848]: I0217 09:44:01.638028 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76x4l" event={"ID":"cfa54fed-3896-41e8-a291-503699247a5f","Type":"ContainerStarted","Data":"d4b4141f056b9afc49cb7f7ee7a340ed627c7d04a170354c5e4377f8565acda8"} Feb 17 09:44:02 crc kubenswrapper[4848]: I0217 09:44:02.647572 4848 generic.go:334] "Generic (PLEG): container finished" podID="cfa54fed-3896-41e8-a291-503699247a5f" containerID="aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6" exitCode=0 Feb 17 09:44:02 crc kubenswrapper[4848]: I0217 
09:44:02.647855 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76x4l" event={"ID":"cfa54fed-3896-41e8-a291-503699247a5f","Type":"ContainerDied","Data":"aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6"} Feb 17 09:44:03 crc kubenswrapper[4848]: I0217 09:44:03.658450 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76x4l" event={"ID":"cfa54fed-3896-41e8-a291-503699247a5f","Type":"ContainerStarted","Data":"cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446"} Feb 17 09:44:03 crc kubenswrapper[4848]: I0217 09:44:03.681018 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-76x4l" podStartSLOduration=2.266420387 podStartE2EDuration="3.681000594s" podCreationTimestamp="2026-02-17 09:44:00 +0000 UTC" firstStartedPulling="2026-02-17 09:44:01.641136084 +0000 UTC m=+2319.184391730" lastFinishedPulling="2026-02-17 09:44:03.055716291 +0000 UTC m=+2320.598971937" observedRunningTime="2026-02-17 09:44:03.676079773 +0000 UTC m=+2321.219335469" watchObservedRunningTime="2026-02-17 09:44:03.681000594 +0000 UTC m=+2321.224256250" Feb 17 09:44:10 crc kubenswrapper[4848]: I0217 09:44:10.828430 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:10 crc kubenswrapper[4848]: I0217 09:44:10.829128 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:10 crc kubenswrapper[4848]: I0217 09:44:10.891449 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:11 crc kubenswrapper[4848]: I0217 09:44:11.813173 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 
09:44:11 crc kubenswrapper[4848]: I0217 09:44:11.885842 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-76x4l"] Feb 17 09:44:13 crc kubenswrapper[4848]: I0217 09:44:13.747737 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-76x4l" podUID="cfa54fed-3896-41e8-a291-503699247a5f" containerName="registry-server" containerID="cri-o://cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446" gracePeriod=2 Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.239799 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.383679 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t2hm\" (UniqueName: \"kubernetes.io/projected/cfa54fed-3896-41e8-a291-503699247a5f-kube-api-access-2t2hm\") pod \"cfa54fed-3896-41e8-a291-503699247a5f\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.383818 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-utilities\") pod \"cfa54fed-3896-41e8-a291-503699247a5f\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.383976 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-catalog-content\") pod \"cfa54fed-3896-41e8-a291-503699247a5f\" (UID: \"cfa54fed-3896-41e8-a291-503699247a5f\") " Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.384566 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-utilities" (OuterVolumeSpecName: "utilities") pod "cfa54fed-3896-41e8-a291-503699247a5f" (UID: "cfa54fed-3896-41e8-a291-503699247a5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.396881 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa54fed-3896-41e8-a291-503699247a5f-kube-api-access-2t2hm" (OuterVolumeSpecName: "kube-api-access-2t2hm") pod "cfa54fed-3896-41e8-a291-503699247a5f" (UID: "cfa54fed-3896-41e8-a291-503699247a5f"). InnerVolumeSpecName "kube-api-access-2t2hm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.417442 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cfa54fed-3896-41e8-a291-503699247a5f" (UID: "cfa54fed-3896-41e8-a291-503699247a5f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.486020 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t2hm\" (UniqueName: \"kubernetes.io/projected/cfa54fed-3896-41e8-a291-503699247a5f-kube-api-access-2t2hm\") on node \"crc\" DevicePath \"\"" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.486058 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.486068 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cfa54fed-3896-41e8-a291-503699247a5f-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.759110 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-76x4l" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.759107 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76x4l" event={"ID":"cfa54fed-3896-41e8-a291-503699247a5f","Type":"ContainerDied","Data":"cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446"} Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.759006 4848 generic.go:334] "Generic (PLEG): container finished" podID="cfa54fed-3896-41e8-a291-503699247a5f" containerID="cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446" exitCode=0 Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.759651 4848 scope.go:117] "RemoveContainer" containerID="cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.759683 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-76x4l" 
event={"ID":"cfa54fed-3896-41e8-a291-503699247a5f","Type":"ContainerDied","Data":"d4b4141f056b9afc49cb7f7ee7a340ed627c7d04a170354c5e4377f8565acda8"} Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.778403 4848 scope.go:117] "RemoveContainer" containerID="aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.801210 4848 scope.go:117] "RemoveContainer" containerID="cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.854271 4848 scope.go:117] "RemoveContainer" containerID="cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446" Feb 17 09:44:14 crc kubenswrapper[4848]: E0217 09:44:14.854723 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446\": container with ID starting with cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446 not found: ID does not exist" containerID="cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.854842 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446"} err="failed to get container status \"cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446\": rpc error: code = NotFound desc = could not find container \"cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446\": container with ID starting with cbd8240220ef5e4a01a2f320280a4bb99aa83de4f6751473bf07aa7695a98446 not found: ID does not exist" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.854888 4848 scope.go:117] "RemoveContainer" containerID="aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6" Feb 17 09:44:14 crc kubenswrapper[4848]: E0217 09:44:14.855368 4848 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6\": container with ID starting with aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6 not found: ID does not exist" containerID="aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.855419 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6"} err="failed to get container status \"aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6\": rpc error: code = NotFound desc = could not find container \"aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6\": container with ID starting with aeb36a6d75a37df747007bac39d556d94d50662f1a30f0b35506482233d716a6 not found: ID does not exist" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.855451 4848 scope.go:117] "RemoveContainer" containerID="cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0" Feb 17 09:44:14 crc kubenswrapper[4848]: E0217 09:44:14.855924 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0\": container with ID starting with cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0 not found: ID does not exist" containerID="cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.855976 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0"} err="failed to get container status \"cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0\": rpc error: code = NotFound 
desc = could not find container \"cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0\": container with ID starting with cc9dd0279685551da71b3ead66233f6a53775444fde594e15a9ed57277463fd0 not found: ID does not exist" Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.861901 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-76x4l"] Feb 17 09:44:14 crc kubenswrapper[4848]: I0217 09:44:14.873365 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-76x4l"] Feb 17 09:44:15 crc kubenswrapper[4848]: I0217 09:44:15.407280 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa54fed-3896-41e8-a291-503699247a5f" path="/var/lib/kubelet/pods/cfa54fed-3896-41e8-a291-503699247a5f/volumes" Feb 17 09:44:18 crc kubenswrapper[4848]: I0217 09:44:18.771916 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:44:18 crc kubenswrapper[4848]: I0217 09:44:18.772741 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:44:48 crc kubenswrapper[4848]: I0217 09:44:48.772039 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:44:48 crc kubenswrapper[4848]: I0217 09:44:48.772545 4848 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.157015 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl"] Feb 17 09:45:00 crc kubenswrapper[4848]: E0217 09:45:00.158047 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa54fed-3896-41e8-a291-503699247a5f" containerName="registry-server" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.158067 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa54fed-3896-41e8-a291-503699247a5f" containerName="registry-server" Feb 17 09:45:00 crc kubenswrapper[4848]: E0217 09:45:00.158105 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa54fed-3896-41e8-a291-503699247a5f" containerName="extract-utilities" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.158114 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa54fed-3896-41e8-a291-503699247a5f" containerName="extract-utilities" Feb 17 09:45:00 crc kubenswrapper[4848]: E0217 09:45:00.158124 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa54fed-3896-41e8-a291-503699247a5f" containerName="extract-content" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.158131 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa54fed-3896-41e8-a291-503699247a5f" containerName="extract-content" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.158354 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa54fed-3896-41e8-a291-503699247a5f" containerName="registry-server" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.159173 4848 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.162940 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.163634 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.178417 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl"] Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.294740 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2d68e0e-8f51-456b-84e9-b8685a49adf7-secret-volume\") pod \"collect-profiles-29522025-fqxsl\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.294853 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2d68e0e-8f51-456b-84e9-b8685a49adf7-config-volume\") pod \"collect-profiles-29522025-fqxsl\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.294895 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dd9wc\" (UniqueName: \"kubernetes.io/projected/e2d68e0e-8f51-456b-84e9-b8685a49adf7-kube-api-access-dd9wc\") pod \"collect-profiles-29522025-fqxsl\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.396835 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2d68e0e-8f51-456b-84e9-b8685a49adf7-secret-volume\") pod \"collect-profiles-29522025-fqxsl\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.396951 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2d68e0e-8f51-456b-84e9-b8685a49adf7-config-volume\") pod \"collect-profiles-29522025-fqxsl\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.396988 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dd9wc\" (UniqueName: \"kubernetes.io/projected/e2d68e0e-8f51-456b-84e9-b8685a49adf7-kube-api-access-dd9wc\") pod \"collect-profiles-29522025-fqxsl\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.397945 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2d68e0e-8f51-456b-84e9-b8685a49adf7-config-volume\") pod \"collect-profiles-29522025-fqxsl\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.406283 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e2d68e0e-8f51-456b-84e9-b8685a49adf7-secret-volume\") pod \"collect-profiles-29522025-fqxsl\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.428226 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dd9wc\" (UniqueName: \"kubernetes.io/projected/e2d68e0e-8f51-456b-84e9-b8685a49adf7-kube-api-access-dd9wc\") pod \"collect-profiles-29522025-fqxsl\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.488189 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:00 crc kubenswrapper[4848]: I0217 09:45:00.934744 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl"] Feb 17 09:45:01 crc kubenswrapper[4848]: I0217 09:45:01.310252 4848 generic.go:334] "Generic (PLEG): container finished" podID="78d70d99-629a-4211-9ead-66a16b766326" containerID="b7ef5605c97534a5ddac5803d05ffea34c43b2926e5c0c91838bf3772e47f0f3" exitCode=0 Feb 17 09:45:01 crc kubenswrapper[4848]: I0217 09:45:01.310317 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" event={"ID":"78d70d99-629a-4211-9ead-66a16b766326","Type":"ContainerDied","Data":"b7ef5605c97534a5ddac5803d05ffea34c43b2926e5c0c91838bf3772e47f0f3"} Feb 17 09:45:01 crc kubenswrapper[4848]: I0217 09:45:01.312071 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" 
event={"ID":"e2d68e0e-8f51-456b-84e9-b8685a49adf7","Type":"ContainerStarted","Data":"9b870a47ce0a55e6fbb7953f4d965c90dfd9cf3e8dda5b77ece4af66f84e5274"} Feb 17 09:45:01 crc kubenswrapper[4848]: I0217 09:45:01.312095 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" event={"ID":"e2d68e0e-8f51-456b-84e9-b8685a49adf7","Type":"ContainerStarted","Data":"eb840ec581bc571a32e25799c7fb6f2ef6ddf8ae03e795fa9599fa3d0d900ba0"} Feb 17 09:45:01 crc kubenswrapper[4848]: I0217 09:45:01.352225 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" podStartSLOduration=1.352204887 podStartE2EDuration="1.352204887s" podCreationTimestamp="2026-02-17 09:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 09:45:01.347281856 +0000 UTC m=+2378.890537502" watchObservedRunningTime="2026-02-17 09:45:01.352204887 +0000 UTC m=+2378.895460553" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.328258 4848 generic.go:334] "Generic (PLEG): container finished" podID="e2d68e0e-8f51-456b-84e9-b8685a49adf7" containerID="9b870a47ce0a55e6fbb7953f4d965c90dfd9cf3e8dda5b77ece4af66f84e5274" exitCode=0 Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.328376 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" event={"ID":"e2d68e0e-8f51-456b-84e9-b8685a49adf7","Type":"ContainerDied","Data":"9b870a47ce0a55e6fbb7953f4d965c90dfd9cf3e8dda5b77ece4af66f84e5274"} Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.759649 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.848509 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-inventory\") pod \"78d70d99-629a-4211-9ead-66a16b766326\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.848562 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbrwp\" (UniqueName: \"kubernetes.io/projected/78d70d99-629a-4211-9ead-66a16b766326-kube-api-access-rbrwp\") pod \"78d70d99-629a-4211-9ead-66a16b766326\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.848838 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-0\") pod \"78d70d99-629a-4211-9ead-66a16b766326\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.848921 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-telemetry-combined-ca-bundle\") pod \"78d70d99-629a-4211-9ead-66a16b766326\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.848955 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-2\") pod \"78d70d99-629a-4211-9ead-66a16b766326\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " Feb 17 09:45:02 crc kubenswrapper[4848]: 
I0217 09:45:02.848984 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ssh-key-openstack-edpm-ipam\") pod \"78d70d99-629a-4211-9ead-66a16b766326\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.849033 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-1\") pod \"78d70d99-629a-4211-9ead-66a16b766326\" (UID: \"78d70d99-629a-4211-9ead-66a16b766326\") " Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.854426 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "78d70d99-629a-4211-9ead-66a16b766326" (UID: "78d70d99-629a-4211-9ead-66a16b766326"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.855089 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78d70d99-629a-4211-9ead-66a16b766326-kube-api-access-rbrwp" (OuterVolumeSpecName: "kube-api-access-rbrwp") pod "78d70d99-629a-4211-9ead-66a16b766326" (UID: "78d70d99-629a-4211-9ead-66a16b766326"). InnerVolumeSpecName "kube-api-access-rbrwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.885417 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "78d70d99-629a-4211-9ead-66a16b766326" (UID: "78d70d99-629a-4211-9ead-66a16b766326"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.887347 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "78d70d99-629a-4211-9ead-66a16b766326" (UID: "78d70d99-629a-4211-9ead-66a16b766326"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.893543 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "78d70d99-629a-4211-9ead-66a16b766326" (UID: "78d70d99-629a-4211-9ead-66a16b766326"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.898604 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "78d70d99-629a-4211-9ead-66a16b766326" (UID: "78d70d99-629a-4211-9ead-66a16b766326"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.905709 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-inventory" (OuterVolumeSpecName: "inventory") pod "78d70d99-629a-4211-9ead-66a16b766326" (UID: "78d70d99-629a-4211-9ead-66a16b766326"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.952340 4848 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-inventory\") on node \"crc\" DevicePath \"\"" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.952402 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbrwp\" (UniqueName: \"kubernetes.io/projected/78d70d99-629a-4211-9ead-66a16b766326-kube-api-access-rbrwp\") on node \"crc\" DevicePath \"\"" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.952418 4848 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.952429 4848 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.952466 4848 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.952482 4848 reconciler_common.go:293] 
"Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 17 09:45:02 crc kubenswrapper[4848]: I0217 09:45:02.952496 4848 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/78d70d99-629a-4211-9ead-66a16b766326-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.342310 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.342361 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-x4455" event={"ID":"78d70d99-629a-4211-9ead-66a16b766326","Type":"ContainerDied","Data":"058ddd475c360a9af201a9262443f2658bd25aa06d6127a2dbd33413642fc380"} Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.344626 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="058ddd475c360a9af201a9262443f2658bd25aa06d6127a2dbd33413642fc380" Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.620232 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.771597 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dd9wc\" (UniqueName: \"kubernetes.io/projected/e2d68e0e-8f51-456b-84e9-b8685a49adf7-kube-api-access-dd9wc\") pod \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.771658 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2d68e0e-8f51-456b-84e9-b8685a49adf7-config-volume\") pod \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.771899 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2d68e0e-8f51-456b-84e9-b8685a49adf7-secret-volume\") pod \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\" (UID: \"e2d68e0e-8f51-456b-84e9-b8685a49adf7\") " Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.773413 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2d68e0e-8f51-456b-84e9-b8685a49adf7-config-volume" (OuterVolumeSpecName: "config-volume") pod "e2d68e0e-8f51-456b-84e9-b8685a49adf7" (UID: "e2d68e0e-8f51-456b-84e9-b8685a49adf7"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.778711 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2d68e0e-8f51-456b-84e9-b8685a49adf7-kube-api-access-dd9wc" (OuterVolumeSpecName: "kube-api-access-dd9wc") pod "e2d68e0e-8f51-456b-84e9-b8685a49adf7" (UID: "e2d68e0e-8f51-456b-84e9-b8685a49adf7"). 
InnerVolumeSpecName "kube-api-access-dd9wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.779335 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d68e0e-8f51-456b-84e9-b8685a49adf7-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e2d68e0e-8f51-456b-84e9-b8685a49adf7" (UID: "e2d68e0e-8f51-456b-84e9-b8685a49adf7"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.874225 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dd9wc\" (UniqueName: \"kubernetes.io/projected/e2d68e0e-8f51-456b-84e9-b8685a49adf7-kube-api-access-dd9wc\") on node \"crc\" DevicePath \"\"" Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.874262 4848 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2d68e0e-8f51-456b-84e9-b8685a49adf7-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 09:45:03 crc kubenswrapper[4848]: I0217 09:45:03.874276 4848 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2d68e0e-8f51-456b-84e9-b8685a49adf7-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 09:45:04 crc kubenswrapper[4848]: I0217 09:45:04.358715 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" event={"ID":"e2d68e0e-8f51-456b-84e9-b8685a49adf7","Type":"ContainerDied","Data":"eb840ec581bc571a32e25799c7fb6f2ef6ddf8ae03e795fa9599fa3d0d900ba0"} Feb 17 09:45:04 crc kubenswrapper[4848]: I0217 09:45:04.359197 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb840ec581bc571a32e25799c7fb6f2ef6ddf8ae03e795fa9599fa3d0d900ba0" Feb 17 09:45:04 crc kubenswrapper[4848]: I0217 09:45:04.358815 4848 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522025-fqxsl" Feb 17 09:45:04 crc kubenswrapper[4848]: I0217 09:45:04.459446 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m"] Feb 17 09:45:04 crc kubenswrapper[4848]: I0217 09:45:04.473913 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521980-llz2m"] Feb 17 09:45:05 crc kubenswrapper[4848]: I0217 09:45:05.401441 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4d0b3bb-c027-4390-92ac-66aad8bf0d19" path="/var/lib/kubelet/pods/f4d0b3bb-c027-4390-92ac-66aad8bf0d19/volumes" Feb 17 09:45:18 crc kubenswrapper[4848]: I0217 09:45:18.772337 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:45:18 crc kubenswrapper[4848]: I0217 09:45:18.772997 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:45:18 crc kubenswrapper[4848]: I0217 09:45:18.773062 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:45:18 crc kubenswrapper[4848]: I0217 09:45:18.773949 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"} 
pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:45:18 crc kubenswrapper[4848]: I0217 09:45:18.774045 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29" gracePeriod=600 Feb 17 09:45:18 crc kubenswrapper[4848]: E0217 09:45:18.937622 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:45:19 crc kubenswrapper[4848]: I0217 09:45:19.512327 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29" exitCode=0 Feb 17 09:45:19 crc kubenswrapper[4848]: I0217 09:45:19.512414 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"} Feb 17 09:45:19 crc kubenswrapper[4848]: I0217 09:45:19.512673 4848 scope.go:117] "RemoveContainer" containerID="7fd570aced053602ec77ca215312b917159892f88fee9de807e73bca6b19517b" Feb 17 09:45:19 crc kubenswrapper[4848]: I0217 09:45:19.513465 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29" Feb 
17 09:45:19 crc kubenswrapper[4848]: E0217 09:45:19.513784 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:45:28 crc kubenswrapper[4848]: I0217 09:45:28.685320 4848 scope.go:117] "RemoveContainer" containerID="e1825d86b0c3d5d7969f30c17dccb7437e7ab7ad20a72a76cc052bc896cef086"
Feb 17 09:45:31 crc kubenswrapper[4848]: I0217 09:45:31.383708 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:45:31 crc kubenswrapper[4848]: E0217 09:45:31.384372 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:45:45 crc kubenswrapper[4848]: I0217 09:45:45.384038 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:45:45 crc kubenswrapper[4848]: E0217 09:45:45.385358 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.042578 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 17 09:45:57 crc kubenswrapper[4848]: E0217 09:45:57.043711 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2d68e0e-8f51-456b-84e9-b8685a49adf7" containerName="collect-profiles"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.043726 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d68e0e-8f51-456b-84e9-b8685a49adf7" containerName="collect-profiles"
Feb 17 09:45:57 crc kubenswrapper[4848]: E0217 09:45:57.043795 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78d70d99-629a-4211-9ead-66a16b766326" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.043804 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="78d70d99-629a-4211-9ead-66a16b766326" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.043992 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d68e0e-8f51-456b-84e9-b8685a49adf7" containerName="collect-profiles"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.044014 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="78d70d99-629a-4211-9ead-66a16b766326" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.047156 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.049538 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bz9ck"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.050616 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-config-data\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.050699 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.051069 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.051133 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.051838 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.055975 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.059563 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.153254 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqvlg\" (UniqueName: \"kubernetes.io/projected/d195d00b-8819-4a35-9e3c-6b4b21660400-kube-api-access-lqvlg\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.153350 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.153378 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-config-data\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.153405 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.153477 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.153630 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.153795 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.153831 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.153888 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.154415 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.156084 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-config-data\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.159958 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.256054 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.256121 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.256154 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqvlg\" (UniqueName: \"kubernetes.io/projected/d195d00b-8819-4a35-9e3c-6b4b21660400-kube-api-access-lqvlg\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.256225 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.256306 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.256384 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.256604 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.257492 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.257551 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.260955 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.261126 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.290075 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqvlg\" (UniqueName: \"kubernetes.io/projected/d195d00b-8819-4a35-9e3c-6b4b21660400-kube-api-access-lqvlg\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.291454 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.378698 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.383956 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:45:57 crc kubenswrapper[4848]: E0217 09:45:57.384304 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.823587 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"]
Feb 17 09:45:57 crc kubenswrapper[4848]: I0217 09:45:57.873813 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d195d00b-8819-4a35-9e3c-6b4b21660400","Type":"ContainerStarted","Data":"5a4b0bef0e7efddad88a208a71632fb466bf099068b1511d28a70faa4be4a84f"}
Feb 17 09:46:12 crc kubenswrapper[4848]: I0217 09:46:12.383348 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:46:12 crc kubenswrapper[4848]: E0217 09:46:12.384097 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:46:24 crc kubenswrapper[4848]: I0217 09:46:24.383687 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:46:24 crc kubenswrapper[4848]: E0217 09:46:24.384335 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:46:25 crc kubenswrapper[4848]: E0217 09:46:25.957022 4848 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified"
Feb 17 09:46:25 crc kubenswrapper[4848]: E0217 09:46:25.957622 4848 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lqvlg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(d195d00b-8819-4a35-9e3c-6b4b21660400): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 17 09:46:25 crc kubenswrapper[4848]: E0217 09:46:25.958966 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="d195d00b-8819-4a35-9e3c-6b4b21660400"
Feb 17 09:46:26 crc kubenswrapper[4848]: E0217 09:46:26.391826 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="d195d00b-8819-4a35-9e3c-6b4b21660400"
Feb 17 09:46:36 crc kubenswrapper[4848]: I0217 09:46:36.384358 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:46:36 crc kubenswrapper[4848]: E0217 09:46:36.385637 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:46:42 crc kubenswrapper[4848]: I0217 09:46:42.056463 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0"
Feb 17 09:46:43 crc kubenswrapper[4848]: I0217 09:46:43.580428 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d195d00b-8819-4a35-9e3c-6b4b21660400","Type":"ContainerStarted","Data":"334ba25df13b52868e3da44b549cd876738861fea404ae247cd920cf80a1717f"}
Feb 17 09:46:43 crc kubenswrapper[4848]: I0217 09:46:43.608648 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=3.391790106 podStartE2EDuration="47.608627182s" podCreationTimestamp="2026-02-17 09:45:56 +0000 UTC" firstStartedPulling="2026-02-17 09:45:57.835584672 +0000 UTC m=+2435.378840318" lastFinishedPulling="2026-02-17 09:46:42.052421718 +0000 UTC m=+2479.595677394" observedRunningTime="2026-02-17 09:46:43.601543176 +0000 UTC m=+2481.144798842" watchObservedRunningTime="2026-02-17 09:46:43.608627182 +0000 UTC m=+2481.151882828"
Feb 17 09:46:51 crc kubenswrapper[4848]: I0217 09:46:51.384341 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:46:51 crc kubenswrapper[4848]: E0217 09:46:51.385296 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:47:03 crc kubenswrapper[4848]: I0217 09:47:03.396914 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:47:03 crc kubenswrapper[4848]: E0217 09:47:03.398094 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:47:15 crc kubenswrapper[4848]: I0217 09:47:15.389556 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:47:15 crc kubenswrapper[4848]: E0217 09:47:15.390359 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:47:29 crc kubenswrapper[4848]: I0217 09:47:29.384715 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:47:29 crc kubenswrapper[4848]: E0217 09:47:29.386003 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:47:43 crc kubenswrapper[4848]: I0217 09:47:43.393610 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:47:43 crc kubenswrapper[4848]: E0217 09:47:43.394510 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:47:58 crc kubenswrapper[4848]: I0217 09:47:58.383139 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:47:58 crc kubenswrapper[4848]: E0217 09:47:58.384412 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:48:11 crc kubenswrapper[4848]: I0217 09:48:11.384379 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:48:11 crc kubenswrapper[4848]: E0217 09:48:11.385661 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:48:23 crc kubenswrapper[4848]: I0217 09:48:23.398098 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:48:23 crc kubenswrapper[4848]: E0217 09:48:23.399012 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:48:34 crc kubenswrapper[4848]: I0217 09:48:34.384659 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:48:34 crc kubenswrapper[4848]: E0217 09:48:34.386123 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:48:47 crc kubenswrapper[4848]: I0217 09:48:47.383658 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:48:47 crc kubenswrapper[4848]: E0217 09:48:47.384861 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:49:00 crc kubenswrapper[4848]: I0217 09:49:00.383437 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:49:00 crc kubenswrapper[4848]: E0217 09:49:00.384214 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:49:14 crc kubenswrapper[4848]: I0217 09:49:14.384561 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:49:14 crc kubenswrapper[4848]: E0217 09:49:14.385246 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:49:25 crc kubenswrapper[4848]: I0217 09:49:25.383008 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:49:25 crc kubenswrapper[4848]: E0217 09:49:25.383735 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:49:36 crc kubenswrapper[4848]: I0217 09:49:36.384752 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:49:36 crc kubenswrapper[4848]: E0217 09:49:36.385588 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:49:47 crc kubenswrapper[4848]: I0217 09:49:47.922901 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hwpbx"]
Feb 17 09:49:47 crc kubenswrapper[4848]: I0217 09:49:47.927714 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:47 crc kubenswrapper[4848]: I0217 09:49:47.960057 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hwpbx"]
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.090640 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-utilities\") pod \"redhat-operators-hwpbx\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.090725 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4srn\" (UniqueName: \"kubernetes.io/projected/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-kube-api-access-x4srn\") pod \"redhat-operators-hwpbx\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.090865 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-catalog-content\") pod \"redhat-operators-hwpbx\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.192853 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4srn\" (UniqueName: \"kubernetes.io/projected/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-kube-api-access-x4srn\") pod \"redhat-operators-hwpbx\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.192998 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-catalog-content\") pod \"redhat-operators-hwpbx\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.193150 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-utilities\") pod \"redhat-operators-hwpbx\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.193811 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-utilities\") pod \"redhat-operators-hwpbx\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.193942 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-catalog-content\") pod \"redhat-operators-hwpbx\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.223183 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4srn\" (UniqueName: \"kubernetes.io/projected/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-kube-api-access-x4srn\") pod \"redhat-operators-hwpbx\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.273897 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwpbx"
Feb 17 09:49:48 crc kubenswrapper[4848]: I0217 09:49:48.762608 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hwpbx"]
Feb 17 09:49:49 crc kubenswrapper[4848]: I0217 09:49:49.383875 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29"
Feb 17 09:49:49 crc kubenswrapper[4848]: E0217 09:49:49.384387 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 09:49:49 crc kubenswrapper[4848]: I0217 09:49:49.479342 4848 generic.go:334] "Generic (PLEG): container finished" podID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerID="ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a" exitCode=0
Feb 17 09:49:49 crc kubenswrapper[4848]: I0217 09:49:49.479387 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwpbx" event={"ID":"001dc1a6-a9a1-4b4e-8164-bf588a51ff54","Type":"ContainerDied","Data":"ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a"}
Feb 17 09:49:49 crc kubenswrapper[4848]: I0217 09:49:49.479412 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwpbx" event={"ID":"001dc1a6-a9a1-4b4e-8164-bf588a51ff54","Type":"ContainerStarted","Data":"9e4baf02ff169a150d4051ee37ee2a55b07cc6ef9551f37c15da537e79db93b0"}
Feb 17 09:49:49 crc kubenswrapper[4848]: I0217 09:49:49.482150 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 09:49:50 crc kubenswrapper[4848]: I0217 09:49:50.489647 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwpbx" event={"ID":"001dc1a6-a9a1-4b4e-8164-bf588a51ff54","Type":"ContainerStarted","Data":"7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309"}
Feb 17 09:49:54 crc kubenswrapper[4848]: I0217 09:49:54.543544 4848 generic.go:334] "Generic (PLEG): container finished" podID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerID="7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309" exitCode=0
Feb 17 09:49:54 crc kubenswrapper[4848]: I0217 09:49:54.543636 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwpbx" event={"ID":"001dc1a6-a9a1-4b4e-8164-bf588a51ff54","Type":"ContainerDied","Data":"7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309"}
Feb 17 09:49:55 crc kubenswrapper[4848]: I0217 09:49:55.557597 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwpbx" event={"ID":"001dc1a6-a9a1-4b4e-8164-bf588a51ff54","Type":"ContainerStarted","Data":"552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2"}
Feb 17 09:49:55 crc kubenswrapper[4848]: I0217 09:49:55.584025 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-hwpbx" podStartSLOduration=3.024738633 podStartE2EDuration="8.58400136s" podCreationTimestamp="2026-02-17 09:49:47 +0000 UTC" firstStartedPulling="2026-02-17 09:49:49.481755013 +0000 UTC m=+2667.025010669" lastFinishedPulling="2026-02-17 09:49:55.04101775 +0000 UTC m=+2672.584273396" observedRunningTime="2026-02-17 09:49:55.580861053 +0000 UTC m=+2673.124116719" watchObservedRunningTime="2026-02-17 09:49:55.58400136 +0000 UTC m=+2673.127257016" Feb 17 09:49:58 crc kubenswrapper[4848]: I0217 09:49:58.274891 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hwpbx" Feb 17 09:49:58 crc kubenswrapper[4848]: I0217 09:49:58.275355 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hwpbx" Feb 17 09:49:59 crc kubenswrapper[4848]: I0217 09:49:59.336827 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hwpbx" podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerName="registry-server" probeResult="failure" output=< Feb 17 09:49:59 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s Feb 17 09:49:59 crc kubenswrapper[4848]: > Feb 17 09:50:04 crc kubenswrapper[4848]: I0217 09:50:04.383919 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29" Feb 17 09:50:04 crc kubenswrapper[4848]: E0217 09:50:04.385051 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:50:09 crc kubenswrapper[4848]: 
I0217 09:50:09.317920 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hwpbx" podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerName="registry-server" probeResult="failure" output=< Feb 17 09:50:09 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s Feb 17 09:50:09 crc kubenswrapper[4848]: > Feb 17 09:50:16 crc kubenswrapper[4848]: I0217 09:50:16.384905 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29" Feb 17 09:50:16 crc kubenswrapper[4848]: E0217 09:50:16.385638 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:50:18 crc kubenswrapper[4848]: I0217 09:50:18.346583 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hwpbx" Feb 17 09:50:18 crc kubenswrapper[4848]: I0217 09:50:18.415213 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hwpbx" Feb 17 09:50:19 crc kubenswrapper[4848]: I0217 09:50:19.124098 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hwpbx"] Feb 17 09:50:19 crc kubenswrapper[4848]: I0217 09:50:19.796626 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hwpbx" podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerName="registry-server" containerID="cri-o://552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2" gracePeriod=2 Feb 17 09:50:20 crc 
kubenswrapper[4848]: I0217 09:50:20.281623 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hwpbx" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.342278 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-utilities\") pod \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.342378 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4srn\" (UniqueName: \"kubernetes.io/projected/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-kube-api-access-x4srn\") pod \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.342463 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-catalog-content\") pod \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\" (UID: \"001dc1a6-a9a1-4b4e-8164-bf588a51ff54\") " Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.343477 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-utilities" (OuterVolumeSpecName: "utilities") pod "001dc1a6-a9a1-4b4e-8164-bf588a51ff54" (UID: "001dc1a6-a9a1-4b4e-8164-bf588a51ff54"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.350951 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-kube-api-access-x4srn" (OuterVolumeSpecName: "kube-api-access-x4srn") pod "001dc1a6-a9a1-4b4e-8164-bf588a51ff54" (UID: "001dc1a6-a9a1-4b4e-8164-bf588a51ff54"). InnerVolumeSpecName "kube-api-access-x4srn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.445088 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.445140 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4srn\" (UniqueName: \"kubernetes.io/projected/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-kube-api-access-x4srn\") on node \"crc\" DevicePath \"\"" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.478565 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "001dc1a6-a9a1-4b4e-8164-bf588a51ff54" (UID: "001dc1a6-a9a1-4b4e-8164-bf588a51ff54"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.546448 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/001dc1a6-a9a1-4b4e-8164-bf588a51ff54-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.807603 4848 generic.go:334] "Generic (PLEG): container finished" podID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerID="552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2" exitCode=0 Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.807649 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwpbx" event={"ID":"001dc1a6-a9a1-4b4e-8164-bf588a51ff54","Type":"ContainerDied","Data":"552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2"} Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.807681 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hwpbx" event={"ID":"001dc1a6-a9a1-4b4e-8164-bf588a51ff54","Type":"ContainerDied","Data":"9e4baf02ff169a150d4051ee37ee2a55b07cc6ef9551f37c15da537e79db93b0"} Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.807704 4848 scope.go:117] "RemoveContainer" containerID="552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.809921 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hwpbx" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.834609 4848 scope.go:117] "RemoveContainer" containerID="7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.855463 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hwpbx"] Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.866354 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hwpbx"] Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.868469 4848 scope.go:117] "RemoveContainer" containerID="ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.925150 4848 scope.go:117] "RemoveContainer" containerID="552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2" Feb 17 09:50:20 crc kubenswrapper[4848]: E0217 09:50:20.925635 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2\": container with ID starting with 552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2 not found: ID does not exist" containerID="552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.925684 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2"} err="failed to get container status \"552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2\": rpc error: code = NotFound desc = could not find container \"552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2\": container with ID starting with 552c381972fd19263333bbd4f731d229923bb27cc43870ffaf3a650f632274a2 not found: ID does 
not exist" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.925711 4848 scope.go:117] "RemoveContainer" containerID="7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309" Feb 17 09:50:20 crc kubenswrapper[4848]: E0217 09:50:20.927416 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309\": container with ID starting with 7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309 not found: ID does not exist" containerID="7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.927448 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309"} err="failed to get container status \"7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309\": rpc error: code = NotFound desc = could not find container \"7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309\": container with ID starting with 7c7e45ddbfdbfa9d509f0ed24e33ae786a99015a9c0ecbb897eeb2c8f11be309 not found: ID does not exist" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.927468 4848 scope.go:117] "RemoveContainer" containerID="ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a" Feb 17 09:50:20 crc kubenswrapper[4848]: E0217 09:50:20.927749 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a\": container with ID starting with ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a not found: ID does not exist" containerID="ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a" Feb 17 09:50:20 crc kubenswrapper[4848]: I0217 09:50:20.927835 4848 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a"} err="failed to get container status \"ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a\": rpc error: code = NotFound desc = could not find container \"ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a\": container with ID starting with ce71d8441bfb096c6e2f957304c0f993151fbb19be8e7542c984e2d27361072a not found: ID does not exist" Feb 17 09:50:21 crc kubenswrapper[4848]: I0217 09:50:21.397595 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" path="/var/lib/kubelet/pods/001dc1a6-a9a1-4b4e-8164-bf588a51ff54/volumes" Feb 17 09:50:29 crc kubenswrapper[4848]: I0217 09:50:29.383674 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29" Feb 17 09:50:29 crc kubenswrapper[4848]: I0217 09:50:29.906832 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"54b4c054c8795f93009bd8a79273fe782e5dc24d5d27896029a69001600777f4"} Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.728312 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hwq6m"] Feb 17 09:51:25 crc kubenswrapper[4848]: E0217 09:51:25.732244 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerName="extract-utilities" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.732278 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerName="extract-utilities" Feb 17 09:51:25 crc kubenswrapper[4848]: E0217 09:51:25.732301 4848 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerName="registry-server" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.732309 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerName="registry-server" Feb 17 09:51:25 crc kubenswrapper[4848]: E0217 09:51:25.732319 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerName="extract-content" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.732326 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerName="extract-content" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.732557 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="001dc1a6-a9a1-4b4e-8164-bf588a51ff54" containerName="registry-server" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.735446 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.768133 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hwq6m"] Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.862486 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzpv6\" (UniqueName: \"kubernetes.io/projected/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-kube-api-access-fzpv6\") pod \"community-operators-hwq6m\" (UID: \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.862543 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-catalog-content\") pod \"community-operators-hwq6m\" (UID: 
\"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.862610 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-utilities\") pod \"community-operators-hwq6m\" (UID: \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.964430 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzpv6\" (UniqueName: \"kubernetes.io/projected/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-kube-api-access-fzpv6\") pod \"community-operators-hwq6m\" (UID: \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.964490 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-catalog-content\") pod \"community-operators-hwq6m\" (UID: \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.964548 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-utilities\") pod \"community-operators-hwq6m\" (UID: \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.965702 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-utilities\") pod \"community-operators-hwq6m\" (UID: 
\"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.965995 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-catalog-content\") pod \"community-operators-hwq6m\" (UID: \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:25 crc kubenswrapper[4848]: I0217 09:51:25.992947 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzpv6\" (UniqueName: \"kubernetes.io/projected/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-kube-api-access-fzpv6\") pod \"community-operators-hwq6m\" (UID: \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:26 crc kubenswrapper[4848]: I0217 09:51:26.074906 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:26 crc kubenswrapper[4848]: I0217 09:51:26.621287 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hwq6m"] Feb 17 09:51:27 crc kubenswrapper[4848]: I0217 09:51:27.482357 4848 generic.go:334] "Generic (PLEG): container finished" podID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerID="e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89" exitCode=0 Feb 17 09:51:27 crc kubenswrapper[4848]: I0217 09:51:27.482414 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwq6m" event={"ID":"6759c187-9c23-4d8e-8c9f-9cb10756c7f9","Type":"ContainerDied","Data":"e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89"} Feb 17 09:51:27 crc kubenswrapper[4848]: I0217 09:51:27.482446 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwq6m" event={"ID":"6759c187-9c23-4d8e-8c9f-9cb10756c7f9","Type":"ContainerStarted","Data":"869266dd4def7a4faa3a60791ca5f326f26e48b90808c238ce64bc4ee3e866ff"} Feb 17 09:51:28 crc kubenswrapper[4848]: I0217 09:51:28.491010 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwq6m" event={"ID":"6759c187-9c23-4d8e-8c9f-9cb10756c7f9","Type":"ContainerStarted","Data":"0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1"} Feb 17 09:51:29 crc kubenswrapper[4848]: I0217 09:51:29.500833 4848 generic.go:334] "Generic (PLEG): container finished" podID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerID="0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1" exitCode=0 Feb 17 09:51:29 crc kubenswrapper[4848]: I0217 09:51:29.500919 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwq6m" 
event={"ID":"6759c187-9c23-4d8e-8c9f-9cb10756c7f9","Type":"ContainerDied","Data":"0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1"} Feb 17 09:51:30 crc kubenswrapper[4848]: I0217 09:51:30.510049 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwq6m" event={"ID":"6759c187-9c23-4d8e-8c9f-9cb10756c7f9","Type":"ContainerStarted","Data":"86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009"} Feb 17 09:51:30 crc kubenswrapper[4848]: I0217 09:51:30.538024 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hwq6m" podStartSLOduration=3.117300388 podStartE2EDuration="5.538008017s" podCreationTimestamp="2026-02-17 09:51:25 +0000 UTC" firstStartedPulling="2026-02-17 09:51:27.485829859 +0000 UTC m=+2765.029085505" lastFinishedPulling="2026-02-17 09:51:29.906537488 +0000 UTC m=+2767.449793134" observedRunningTime="2026-02-17 09:51:30.527525539 +0000 UTC m=+2768.070781205" watchObservedRunningTime="2026-02-17 09:51:30.538008017 +0000 UTC m=+2768.081263663" Feb 17 09:51:36 crc kubenswrapper[4848]: I0217 09:51:36.075026 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:36 crc kubenswrapper[4848]: I0217 09:51:36.075670 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:36 crc kubenswrapper[4848]: I0217 09:51:36.161617 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:36 crc kubenswrapper[4848]: I0217 09:51:36.613799 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:36 crc kubenswrapper[4848]: I0217 09:51:36.664966 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-hwq6m"] Feb 17 09:51:38 crc kubenswrapper[4848]: I0217 09:51:38.584671 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hwq6m" podUID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerName="registry-server" containerID="cri-o://86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009" gracePeriod=2 Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.143047 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.214779 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzpv6\" (UniqueName: \"kubernetes.io/projected/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-kube-api-access-fzpv6\") pod \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\" (UID: \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.214839 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-utilities\") pod \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\" (UID: \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.214923 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-catalog-content\") pod \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\" (UID: \"6759c187-9c23-4d8e-8c9f-9cb10756c7f9\") " Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.216104 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-utilities" (OuterVolumeSpecName: "utilities") pod "6759c187-9c23-4d8e-8c9f-9cb10756c7f9" (UID: 
"6759c187-9c23-4d8e-8c9f-9cb10756c7f9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.226982 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-kube-api-access-fzpv6" (OuterVolumeSpecName: "kube-api-access-fzpv6") pod "6759c187-9c23-4d8e-8c9f-9cb10756c7f9" (UID: "6759c187-9c23-4d8e-8c9f-9cb10756c7f9"). InnerVolumeSpecName "kube-api-access-fzpv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.276925 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6759c187-9c23-4d8e-8c9f-9cb10756c7f9" (UID: "6759c187-9c23-4d8e-8c9f-9cb10756c7f9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.316378 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzpv6\" (UniqueName: \"kubernetes.io/projected/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-kube-api-access-fzpv6\") on node \"crc\" DevicePath \"\"" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.316419 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.316430 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6759c187-9c23-4d8e-8c9f-9cb10756c7f9-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.602560 4848 generic.go:334] "Generic (PLEG): container finished" 
podID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerID="86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009" exitCode=0 Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.602638 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwq6m" event={"ID":"6759c187-9c23-4d8e-8c9f-9cb10756c7f9","Type":"ContainerDied","Data":"86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009"} Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.602681 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hwq6m" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.602721 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hwq6m" event={"ID":"6759c187-9c23-4d8e-8c9f-9cb10756c7f9","Type":"ContainerDied","Data":"869266dd4def7a4faa3a60791ca5f326f26e48b90808c238ce64bc4ee3e866ff"} Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.602785 4848 scope.go:117] "RemoveContainer" containerID="86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.643828 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hwq6m"] Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.651248 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hwq6m"] Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.653485 4848 scope.go:117] "RemoveContainer" containerID="0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.681556 4848 scope.go:117] "RemoveContainer" containerID="e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.756036 4848 scope.go:117] "RemoveContainer" 
containerID="86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009" Feb 17 09:51:39 crc kubenswrapper[4848]: E0217 09:51:39.756550 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009\": container with ID starting with 86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009 not found: ID does not exist" containerID="86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.756585 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009"} err="failed to get container status \"86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009\": rpc error: code = NotFound desc = could not find container \"86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009\": container with ID starting with 86d5f898e5a5e38c69847d479e4ab9f12113928e3e98916369131febee9e6009 not found: ID does not exist" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.756607 4848 scope.go:117] "RemoveContainer" containerID="0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1" Feb 17 09:51:39 crc kubenswrapper[4848]: E0217 09:51:39.757684 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1\": container with ID starting with 0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1 not found: ID does not exist" containerID="0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.757706 4848 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1"} err="failed to get container status \"0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1\": rpc error: code = NotFound desc = could not find container \"0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1\": container with ID starting with 0f160d7a20b9ae1cf2f2901acce2bb1dd29961e2ec64b87abf6f82073df9d6b1 not found: ID does not exist" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.757718 4848 scope.go:117] "RemoveContainer" containerID="e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89" Feb 17 09:51:39 crc kubenswrapper[4848]: E0217 09:51:39.758050 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89\": container with ID starting with e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89 not found: ID does not exist" containerID="e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89" Feb 17 09:51:39 crc kubenswrapper[4848]: I0217 09:51:39.758076 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89"} err="failed to get container status \"e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89\": rpc error: code = NotFound desc = could not find container \"e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89\": container with ID starting with e865756c016dd59a40cdfb649d368c88a980f61c0632c8dc6d900dc45f20cc89 not found: ID does not exist" Feb 17 09:51:41 crc kubenswrapper[4848]: I0217 09:51:41.397047 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" path="/var/lib/kubelet/pods/6759c187-9c23-4d8e-8c9f-9cb10756c7f9/volumes" Feb 17 09:52:48 crc kubenswrapper[4848]: I0217 
09:52:48.771128 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:52:48 crc kubenswrapper[4848]: I0217 09:52:48.771601 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.114903 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5v6mg"] Feb 17 09:53:04 crc kubenswrapper[4848]: E0217 09:53:04.115964 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerName="extract-utilities" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.115983 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerName="extract-utilities" Feb 17 09:53:04 crc kubenswrapper[4848]: E0217 09:53:04.115997 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerName="registry-server" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.116004 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerName="registry-server" Feb 17 09:53:04 crc kubenswrapper[4848]: E0217 09:53:04.116016 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerName="extract-content" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.116023 4848 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerName="extract-content" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.116245 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6759c187-9c23-4d8e-8c9f-9cb10756c7f9" containerName="registry-server" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.117944 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.127640 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5v6mg"] Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.162055 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-utilities\") pod \"certified-operators-5v6mg\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.162169 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shjt5\" (UniqueName: \"kubernetes.io/projected/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-kube-api-access-shjt5\") pod \"certified-operators-5v6mg\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.162282 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-catalog-content\") pod \"certified-operators-5v6mg\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.264428 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-catalog-content\") pod \"certified-operators-5v6mg\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.264892 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-utilities\") pod \"certified-operators-5v6mg\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.264996 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-catalog-content\") pod \"certified-operators-5v6mg\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.265089 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shjt5\" (UniqueName: \"kubernetes.io/projected/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-kube-api-access-shjt5\") pod \"certified-operators-5v6mg\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.265160 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-utilities\") pod \"certified-operators-5v6mg\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.291789 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-shjt5\" (UniqueName: \"kubernetes.io/projected/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-kube-api-access-shjt5\") pod \"certified-operators-5v6mg\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.440821 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:04 crc kubenswrapper[4848]: I0217 09:53:04.915186 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5v6mg"] Feb 17 09:53:04 crc kubenswrapper[4848]: W0217 09:53:04.928936 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ac6e362_0b8a_4f1a_8776_b1f4e01b724d.slice/crio-17dd5068a7ae55268ebe83712440bac930ce34290324267c85c8d8eebf830ab3 WatchSource:0}: Error finding container 17dd5068a7ae55268ebe83712440bac930ce34290324267c85c8d8eebf830ab3: Status 404 returned error can't find the container with id 17dd5068a7ae55268ebe83712440bac930ce34290324267c85c8d8eebf830ab3 Feb 17 09:53:05 crc kubenswrapper[4848]: I0217 09:53:05.479818 4848 generic.go:334] "Generic (PLEG): container finished" podID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerID="069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1" exitCode=0 Feb 17 09:53:05 crc kubenswrapper[4848]: I0217 09:53:05.479928 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v6mg" event={"ID":"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d","Type":"ContainerDied","Data":"069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1"} Feb 17 09:53:05 crc kubenswrapper[4848]: I0217 09:53:05.480151 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v6mg" 
event={"ID":"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d","Type":"ContainerStarted","Data":"17dd5068a7ae55268ebe83712440bac930ce34290324267c85c8d8eebf830ab3"} Feb 17 09:53:06 crc kubenswrapper[4848]: I0217 09:53:06.490647 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v6mg" event={"ID":"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d","Type":"ContainerStarted","Data":"12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef"} Feb 17 09:53:07 crc kubenswrapper[4848]: I0217 09:53:07.500511 4848 generic.go:334] "Generic (PLEG): container finished" podID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerID="12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef" exitCode=0 Feb 17 09:53:07 crc kubenswrapper[4848]: I0217 09:53:07.500556 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v6mg" event={"ID":"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d","Type":"ContainerDied","Data":"12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef"} Feb 17 09:53:08 crc kubenswrapper[4848]: I0217 09:53:08.512960 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v6mg" event={"ID":"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d","Type":"ContainerStarted","Data":"4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895"} Feb 17 09:53:08 crc kubenswrapper[4848]: I0217 09:53:08.536446 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5v6mg" podStartSLOduration=2.107210293 podStartE2EDuration="4.536428285s" podCreationTimestamp="2026-02-17 09:53:04 +0000 UTC" firstStartedPulling="2026-02-17 09:53:05.48224072 +0000 UTC m=+2863.025496376" lastFinishedPulling="2026-02-17 09:53:07.911458712 +0000 UTC m=+2865.454714368" observedRunningTime="2026-02-17 09:53:08.530588479 +0000 UTC m=+2866.073844145" watchObservedRunningTime="2026-02-17 09:53:08.536428285 +0000 UTC 
m=+2866.079683931" Feb 17 09:53:14 crc kubenswrapper[4848]: I0217 09:53:14.442687 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:14 crc kubenswrapper[4848]: I0217 09:53:14.443270 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:14 crc kubenswrapper[4848]: I0217 09:53:14.509837 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:14 crc kubenswrapper[4848]: I0217 09:53:14.635320 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:14 crc kubenswrapper[4848]: I0217 09:53:14.760282 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5v6mg"] Feb 17 09:53:16 crc kubenswrapper[4848]: I0217 09:53:16.607001 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5v6mg" podUID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerName="registry-server" containerID="cri-o://4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895" gracePeriod=2 Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.157786 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.320349 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shjt5\" (UniqueName: \"kubernetes.io/projected/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-kube-api-access-shjt5\") pod \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.320681 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-catalog-content\") pod \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.321049 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-utilities\") pod \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\" (UID: \"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d\") " Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.321741 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-utilities" (OuterVolumeSpecName: "utilities") pod "9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" (UID: "9ac6e362-0b8a-4f1a-8776-b1f4e01b724d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.322179 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.331062 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-kube-api-access-shjt5" (OuterVolumeSpecName: "kube-api-access-shjt5") pod "9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" (UID: "9ac6e362-0b8a-4f1a-8776-b1f4e01b724d"). InnerVolumeSpecName "kube-api-access-shjt5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.400220 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" (UID: "9ac6e362-0b8a-4f1a-8776-b1f4e01b724d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.424752 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shjt5\" (UniqueName: \"kubernetes.io/projected/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-kube-api-access-shjt5\") on node \"crc\" DevicePath \"\"" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.424806 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.618859 4848 generic.go:334] "Generic (PLEG): container finished" podID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerID="4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895" exitCode=0 Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.618918 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v6mg" event={"ID":"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d","Type":"ContainerDied","Data":"4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895"} Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.618936 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5v6mg" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.618958 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5v6mg" event={"ID":"9ac6e362-0b8a-4f1a-8776-b1f4e01b724d","Type":"ContainerDied","Data":"17dd5068a7ae55268ebe83712440bac930ce34290324267c85c8d8eebf830ab3"} Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.618989 4848 scope.go:117] "RemoveContainer" containerID="4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.640667 4848 scope.go:117] "RemoveContainer" containerID="12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.663294 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5v6mg"] Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.675223 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5v6mg"] Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.687088 4848 scope.go:117] "RemoveContainer" containerID="069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.711498 4848 scope.go:117] "RemoveContainer" containerID="4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895" Feb 17 09:53:17 crc kubenswrapper[4848]: E0217 09:53:17.711988 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895\": container with ID starting with 4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895 not found: ID does not exist" containerID="4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.712025 4848 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895"} err="failed to get container status \"4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895\": rpc error: code = NotFound desc = could not find container \"4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895\": container with ID starting with 4271aec99abe43108b7874ce0e6e4b5e45e92cbbb0bfa4a29468637bf38a5895 not found: ID does not exist" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.712052 4848 scope.go:117] "RemoveContainer" containerID="12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef" Feb 17 09:53:17 crc kubenswrapper[4848]: E0217 09:53:17.712552 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef\": container with ID starting with 12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef not found: ID does not exist" containerID="12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.712615 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef"} err="failed to get container status \"12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef\": rpc error: code = NotFound desc = could not find container \"12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef\": container with ID starting with 12af020316140a9593fd592bc37f55b85d7fa531ea4edbe71315ce070c89baef not found: ID does not exist" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.712657 4848 scope.go:117] "RemoveContainer" containerID="069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1" Feb 17 09:53:17 crc kubenswrapper[4848]: E0217 
09:53:17.714393 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1\": container with ID starting with 069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1 not found: ID does not exist" containerID="069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1" Feb 17 09:53:17 crc kubenswrapper[4848]: I0217 09:53:17.714426 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1"} err="failed to get container status \"069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1\": rpc error: code = NotFound desc = could not find container \"069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1\": container with ID starting with 069af5d82ffa597a9f07ae6ddd9e951875f991d62be0f930a3685f3c6e1f4da1 not found: ID does not exist" Feb 17 09:53:18 crc kubenswrapper[4848]: I0217 09:53:18.771962 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:53:18 crc kubenswrapper[4848]: I0217 09:53:18.772205 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:53:19 crc kubenswrapper[4848]: I0217 09:53:19.396981 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" 
path="/var/lib/kubelet/pods/9ac6e362-0b8a-4f1a-8776-b1f4e01b724d/volumes" Feb 17 09:53:48 crc kubenswrapper[4848]: I0217 09:53:48.771803 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:53:48 crc kubenswrapper[4848]: I0217 09:53:48.772537 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:53:48 crc kubenswrapper[4848]: I0217 09:53:48.772594 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:53:48 crc kubenswrapper[4848]: I0217 09:53:48.773499 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"54b4c054c8795f93009bd8a79273fe782e5dc24d5d27896029a69001600777f4"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:53:48 crc kubenswrapper[4848]: I0217 09:53:48.773594 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://54b4c054c8795f93009bd8a79273fe782e5dc24d5d27896029a69001600777f4" gracePeriod=600 Feb 17 09:53:48 crc kubenswrapper[4848]: I0217 09:53:48.950729 4848 generic.go:334] "Generic (PLEG): container finished" 
podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="54b4c054c8795f93009bd8a79273fe782e5dc24d5d27896029a69001600777f4" exitCode=0 Feb 17 09:53:48 crc kubenswrapper[4848]: I0217 09:53:48.950801 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"54b4c054c8795f93009bd8a79273fe782e5dc24d5d27896029a69001600777f4"} Feb 17 09:53:48 crc kubenswrapper[4848]: I0217 09:53:48.950867 4848 scope.go:117] "RemoveContainer" containerID="b2666daf80e2b05948a3c6d58be237f04d1ed44cac8d3c9209b532cfabcafd29" Feb 17 09:53:49 crc kubenswrapper[4848]: I0217 09:53:49.964168 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5"} Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.439359 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bg8k7"] Feb 17 09:55:21 crc kubenswrapper[4848]: E0217 09:55:21.440400 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerName="extract-utilities" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.440418 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerName="extract-utilities" Feb 17 09:55:21 crc kubenswrapper[4848]: E0217 09:55:21.440436 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerName="extract-content" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.440444 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerName="extract-content" Feb 17 09:55:21 crc kubenswrapper[4848]: 
E0217 09:55:21.440467 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerName="registry-server" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.440476 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerName="registry-server" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.440710 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ac6e362-0b8a-4f1a-8776-b1f4e01b724d" containerName="registry-server" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.442265 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.448612 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bg8k7"] Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.503470 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-catalog-content\") pod \"redhat-marketplace-bg8k7\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.503565 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-utilities\") pod \"redhat-marketplace-bg8k7\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.503591 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2k9c\" (UniqueName: 
\"kubernetes.io/projected/2745521b-e088-4993-9517-efc21d7b8a15-kube-api-access-s2k9c\") pod \"redhat-marketplace-bg8k7\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.605969 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-catalog-content\") pod \"redhat-marketplace-bg8k7\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.606048 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-utilities\") pod \"redhat-marketplace-bg8k7\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.606071 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2k9c\" (UniqueName: \"kubernetes.io/projected/2745521b-e088-4993-9517-efc21d7b8a15-kube-api-access-s2k9c\") pod \"redhat-marketplace-bg8k7\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.606666 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-utilities\") pod \"redhat-marketplace-bg8k7\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.606719 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-catalog-content\") pod \"redhat-marketplace-bg8k7\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.630704 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2k9c\" (UniqueName: \"kubernetes.io/projected/2745521b-e088-4993-9517-efc21d7b8a15-kube-api-access-s2k9c\") pod \"redhat-marketplace-bg8k7\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:21 crc kubenswrapper[4848]: I0217 09:55:21.767556 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:22 crc kubenswrapper[4848]: I0217 09:55:22.217484 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bg8k7"] Feb 17 09:55:22 crc kubenswrapper[4848]: I0217 09:55:22.922894 4848 generic.go:334] "Generic (PLEG): container finished" podID="2745521b-e088-4993-9517-efc21d7b8a15" containerID="d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8" exitCode=0 Feb 17 09:55:22 crc kubenswrapper[4848]: I0217 09:55:22.923005 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg8k7" event={"ID":"2745521b-e088-4993-9517-efc21d7b8a15","Type":"ContainerDied","Data":"d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8"} Feb 17 09:55:22 crc kubenswrapper[4848]: I0217 09:55:22.923239 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg8k7" event={"ID":"2745521b-e088-4993-9517-efc21d7b8a15","Type":"ContainerStarted","Data":"19acdbcb3b99b7472804780fece88decfab2818cf09d5d38d4f1bb86ef57b03c"} Feb 17 09:55:22 crc kubenswrapper[4848]: I0217 09:55:22.925235 4848 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Feb 17 09:55:23 crc kubenswrapper[4848]: I0217 09:55:23.941383 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg8k7" event={"ID":"2745521b-e088-4993-9517-efc21d7b8a15","Type":"ContainerStarted","Data":"d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271"} Feb 17 09:55:24 crc kubenswrapper[4848]: I0217 09:55:24.953790 4848 generic.go:334] "Generic (PLEG): container finished" podID="2745521b-e088-4993-9517-efc21d7b8a15" containerID="d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271" exitCode=0 Feb 17 09:55:24 crc kubenswrapper[4848]: I0217 09:55:24.953897 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg8k7" event={"ID":"2745521b-e088-4993-9517-efc21d7b8a15","Type":"ContainerDied","Data":"d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271"} Feb 17 09:55:26 crc kubenswrapper[4848]: I0217 09:55:26.974475 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg8k7" event={"ID":"2745521b-e088-4993-9517-efc21d7b8a15","Type":"ContainerStarted","Data":"1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433"} Feb 17 09:55:27 crc kubenswrapper[4848]: I0217 09:55:27.012442 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bg8k7" podStartSLOduration=3.215549929 podStartE2EDuration="6.012423802s" podCreationTimestamp="2026-02-17 09:55:21 +0000 UTC" firstStartedPulling="2026-02-17 09:55:22.924984181 +0000 UTC m=+3000.468239827" lastFinishedPulling="2026-02-17 09:55:25.721858044 +0000 UTC m=+3003.265113700" observedRunningTime="2026-02-17 09:55:27.007684358 +0000 UTC m=+3004.550940034" watchObservedRunningTime="2026-02-17 09:55:27.012423802 +0000 UTC m=+3004.555679448" Feb 17 09:55:31 crc kubenswrapper[4848]: I0217 09:55:31.768302 4848 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:31 crc kubenswrapper[4848]: I0217 09:55:31.768938 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:31 crc kubenswrapper[4848]: I0217 09:55:31.826969 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:32 crc kubenswrapper[4848]: I0217 09:55:32.106465 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:32 crc kubenswrapper[4848]: I0217 09:55:32.172933 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bg8k7"] Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.041007 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bg8k7" podUID="2745521b-e088-4993-9517-efc21d7b8a15" containerName="registry-server" containerID="cri-o://1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433" gracePeriod=2 Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.508963 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.589369 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2k9c\" (UniqueName: \"kubernetes.io/projected/2745521b-e088-4993-9517-efc21d7b8a15-kube-api-access-s2k9c\") pod \"2745521b-e088-4993-9517-efc21d7b8a15\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.589869 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-utilities\") pod \"2745521b-e088-4993-9517-efc21d7b8a15\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.589998 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-catalog-content\") pod \"2745521b-e088-4993-9517-efc21d7b8a15\" (UID: \"2745521b-e088-4993-9517-efc21d7b8a15\") " Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.591409 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-utilities" (OuterVolumeSpecName: "utilities") pod "2745521b-e088-4993-9517-efc21d7b8a15" (UID: "2745521b-e088-4993-9517-efc21d7b8a15"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.595028 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2745521b-e088-4993-9517-efc21d7b8a15-kube-api-access-s2k9c" (OuterVolumeSpecName: "kube-api-access-s2k9c") pod "2745521b-e088-4993-9517-efc21d7b8a15" (UID: "2745521b-e088-4993-9517-efc21d7b8a15"). InnerVolumeSpecName "kube-api-access-s2k9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.664666 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2745521b-e088-4993-9517-efc21d7b8a15" (UID: "2745521b-e088-4993-9517-efc21d7b8a15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.692160 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2k9c\" (UniqueName: \"kubernetes.io/projected/2745521b-e088-4993-9517-efc21d7b8a15-kube-api-access-s2k9c\") on node \"crc\" DevicePath \"\"" Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.692195 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 09:55:34 crc kubenswrapper[4848]: I0217 09:55:34.692206 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2745521b-e088-4993-9517-efc21d7b8a15-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.048459 4848 generic.go:334] "Generic (PLEG): container finished" podID="2745521b-e088-4993-9517-efc21d7b8a15" containerID="1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433" exitCode=0 Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.048504 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bg8k7" event={"ID":"2745521b-e088-4993-9517-efc21d7b8a15","Type":"ContainerDied","Data":"1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433"} Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.048530 4848 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bg8k7" event={"ID":"2745521b-e088-4993-9517-efc21d7b8a15","Type":"ContainerDied","Data":"19acdbcb3b99b7472804780fece88decfab2818cf09d5d38d4f1bb86ef57b03c"} Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.048542 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bg8k7" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.048563 4848 scope.go:117] "RemoveContainer" containerID="1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.073746 4848 scope.go:117] "RemoveContainer" containerID="d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.097588 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bg8k7"] Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.104536 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bg8k7"] Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.107964 4848 scope.go:117] "RemoveContainer" containerID="d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.153658 4848 scope.go:117] "RemoveContainer" containerID="1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433" Feb 17 09:55:35 crc kubenswrapper[4848]: E0217 09:55:35.154351 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433\": container with ID starting with 1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433 not found: ID does not exist" containerID="1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.154402 4848 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433"} err="failed to get container status \"1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433\": rpc error: code = NotFound desc = could not find container \"1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433\": container with ID starting with 1250c8f51d087c91b3012da216e74deec8673415b223a9c04e5626ad9bd67433 not found: ID does not exist" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.154441 4848 scope.go:117] "RemoveContainer" containerID="d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271" Feb 17 09:55:35 crc kubenswrapper[4848]: E0217 09:55:35.154662 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271\": container with ID starting with d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271 not found: ID does not exist" containerID="d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.154697 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271"} err="failed to get container status \"d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271\": rpc error: code = NotFound desc = could not find container \"d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271\": container with ID starting with d7c5b7417e18651a318b098587b5fcdb20e9eb7ceebf22e7d975df3f60dc1271 not found: ID does not exist" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.154720 4848 scope.go:117] "RemoveContainer" containerID="d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8" Feb 17 09:55:35 crc kubenswrapper[4848]: E0217 
09:55:35.155155 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8\": container with ID starting with d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8 not found: ID does not exist" containerID="d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.155187 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8"} err="failed to get container status \"d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8\": rpc error: code = NotFound desc = could not find container \"d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8\": container with ID starting with d017c657a4689582d36e71568c692b3b1f0f790db118563e3f0e05ee3820fae8 not found: ID does not exist" Feb 17 09:55:35 crc kubenswrapper[4848]: I0217 09:55:35.430054 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2745521b-e088-4993-9517-efc21d7b8a15" path="/var/lib/kubelet/pods/2745521b-e088-4993-9517-efc21d7b8a15/volumes" Feb 17 09:56:18 crc kubenswrapper[4848]: I0217 09:56:18.771944 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:56:18 crc kubenswrapper[4848]: I0217 09:56:18.772577 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Feb 17 09:56:48 crc kubenswrapper[4848]: I0217 09:56:48.771842 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:56:48 crc kubenswrapper[4848]: I0217 09:56:48.772633 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:56:50 crc kubenswrapper[4848]: I0217 09:56:50.819873 4848 generic.go:334] "Generic (PLEG): container finished" podID="d195d00b-8819-4a35-9e3c-6b4b21660400" containerID="334ba25df13b52868e3da44b549cd876738861fea404ae247cd920cf80a1717f" exitCode=0 Feb 17 09:56:50 crc kubenswrapper[4848]: I0217 09:56:50.820394 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d195d00b-8819-4a35-9e3c-6b4b21660400","Type":"ContainerDied","Data":"334ba25df13b52868e3da44b549cd876738861fea404ae247cd920cf80a1717f"} Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.230565 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.355707 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config\") pod \"d195d00b-8819-4a35-9e3c-6b4b21660400\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.355866 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ca-certs\") pod \"d195d00b-8819-4a35-9e3c-6b4b21660400\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.355907 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"d195d00b-8819-4a35-9e3c-6b4b21660400\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.356029 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-config-data\") pod \"d195d00b-8819-4a35-9e3c-6b4b21660400\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.356095 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqvlg\" (UniqueName: \"kubernetes.io/projected/d195d00b-8819-4a35-9e3c-6b4b21660400-kube-api-access-lqvlg\") pod \"d195d00b-8819-4a35-9e3c-6b4b21660400\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.356145 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: 
\"kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-workdir\") pod \"d195d00b-8819-4a35-9e3c-6b4b21660400\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.356187 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ssh-key\") pod \"d195d00b-8819-4a35-9e3c-6b4b21660400\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.356259 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config-secret\") pod \"d195d00b-8819-4a35-9e3c-6b4b21660400\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.356388 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-temporary\") pod \"d195d00b-8819-4a35-9e3c-6b4b21660400\" (UID: \"d195d00b-8819-4a35-9e3c-6b4b21660400\") " Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.357451 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d195d00b-8819-4a35-9e3c-6b4b21660400" (UID: "d195d00b-8819-4a35-9e3c-6b4b21660400"). InnerVolumeSpecName "test-operator-ephemeral-temporary". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.357470 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-config-data" (OuterVolumeSpecName: "config-data") pod "d195d00b-8819-4a35-9e3c-6b4b21660400" (UID: "d195d00b-8819-4a35-9e3c-6b4b21660400"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.361163 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d195d00b-8819-4a35-9e3c-6b4b21660400" (UID: "d195d00b-8819-4a35-9e3c-6b4b21660400"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.364139 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d195d00b-8819-4a35-9e3c-6b4b21660400" (UID: "d195d00b-8819-4a35-9e3c-6b4b21660400"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.364877 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d195d00b-8819-4a35-9e3c-6b4b21660400-kube-api-access-lqvlg" (OuterVolumeSpecName: "kube-api-access-lqvlg") pod "d195d00b-8819-4a35-9e3c-6b4b21660400" (UID: "d195d00b-8819-4a35-9e3c-6b4b21660400"). InnerVolumeSpecName "kube-api-access-lqvlg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.393120 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d195d00b-8819-4a35-9e3c-6b4b21660400" (UID: "d195d00b-8819-4a35-9e3c-6b4b21660400"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.393819 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d195d00b-8819-4a35-9e3c-6b4b21660400" (UID: "d195d00b-8819-4a35-9e3c-6b4b21660400"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.394341 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d195d00b-8819-4a35-9e3c-6b4b21660400" (UID: "d195d00b-8819-4a35-9e3c-6b4b21660400"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.423535 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d195d00b-8819-4a35-9e3c-6b4b21660400" (UID: "d195d00b-8819-4a35-9e3c-6b4b21660400"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.459266 4848 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.459363 4848 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.459379 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-config-data\") on node \"crc\" DevicePath \"\"" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.459393 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqvlg\" (UniqueName: \"kubernetes.io/projected/d195d00b-8819-4a35-9e3c-6b4b21660400-kube-api-access-lqvlg\") on node \"crc\" DevicePath \"\"" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.459409 4848 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.459420 4848 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.459431 4848 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 
09:56:52.459446 4848 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d195d00b-8819-4a35-9e3c-6b4b21660400-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.459458 4848 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d195d00b-8819-4a35-9e3c-6b4b21660400-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.481402 4848 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.562128 4848 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.843316 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d195d00b-8819-4a35-9e3c-6b4b21660400","Type":"ContainerDied","Data":"5a4b0bef0e7efddad88a208a71632fb466bf099068b1511d28a70faa4be4a84f"} Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.843355 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a4b0bef0e7efddad88a208a71632fb466bf099068b1511d28a70faa4be4a84f" Feb 17 09:56:52 crc kubenswrapper[4848]: I0217 09:56:52.843454 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.077200 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 09:57:01 crc kubenswrapper[4848]: E0217 09:57:01.079039 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d195d00b-8819-4a35-9e3c-6b4b21660400" containerName="tempest-tests-tempest-tests-runner" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.079063 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="d195d00b-8819-4a35-9e3c-6b4b21660400" containerName="tempest-tests-tempest-tests-runner" Feb 17 09:57:01 crc kubenswrapper[4848]: E0217 09:57:01.079080 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2745521b-e088-4993-9517-efc21d7b8a15" containerName="registry-server" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.079088 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2745521b-e088-4993-9517-efc21d7b8a15" containerName="registry-server" Feb 17 09:57:01 crc kubenswrapper[4848]: E0217 09:57:01.079106 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2745521b-e088-4993-9517-efc21d7b8a15" containerName="extract-utilities" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.079117 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2745521b-e088-4993-9517-efc21d7b8a15" containerName="extract-utilities" Feb 17 09:57:01 crc kubenswrapper[4848]: E0217 09:57:01.079135 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2745521b-e088-4993-9517-efc21d7b8a15" containerName="extract-content" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.079142 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2745521b-e088-4993-9517-efc21d7b8a15" containerName="extract-content" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.079409 4848 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="d195d00b-8819-4a35-9e3c-6b4b21660400" containerName="tempest-tests-tempest-tests-runner" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.079420 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2745521b-e088-4993-9517-efc21d7b8a15" containerName="registry-server" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.080451 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.084880 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-bz9ck" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.107069 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.227354 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c3f9f72f-9857-440a-a108-80d6b424f2a3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.227447 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdfmd\" (UniqueName: \"kubernetes.io/projected/c3f9f72f-9857-440a-a108-80d6b424f2a3-kube-api-access-tdfmd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c3f9f72f-9857-440a-a108-80d6b424f2a3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.330121 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c3f9f72f-9857-440a-a108-80d6b424f2a3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.330253 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdfmd\" (UniqueName: \"kubernetes.io/projected/c3f9f72f-9857-440a-a108-80d6b424f2a3-kube-api-access-tdfmd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c3f9f72f-9857-440a-a108-80d6b424f2a3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.330680 4848 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c3f9f72f-9857-440a-a108-80d6b424f2a3\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.354657 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdfmd\" (UniqueName: \"kubernetes.io/projected/c3f9f72f-9857-440a-a108-80d6b424f2a3-kube-api-access-tdfmd\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c3f9f72f-9857-440a-a108-80d6b424f2a3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.357104 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"c3f9f72f-9857-440a-a108-80d6b424f2a3\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 09:57:01 
crc kubenswrapper[4848]: I0217 09:57:01.410198 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 17 09:57:01 crc kubenswrapper[4848]: I0217 09:57:01.925084 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 17 09:57:02 crc kubenswrapper[4848]: I0217 09:57:02.960606 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c3f9f72f-9857-440a-a108-80d6b424f2a3","Type":"ContainerStarted","Data":"16700256cd6eeb630d2d2c2e2d1361ddef2e4031de6cc5d35d10e2dad933fb39"} Feb 17 09:57:03 crc kubenswrapper[4848]: I0217 09:57:03.970245 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"c3f9f72f-9857-440a-a108-80d6b424f2a3","Type":"ContainerStarted","Data":"2bc3d606fb12c247df29497c161e31ac8d640bc5ddf475bdbc56ffa51482faa1"} Feb 17 09:57:03 crc kubenswrapper[4848]: I0217 09:57:03.996835 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.198459193 podStartE2EDuration="2.99681248s" podCreationTimestamp="2026-02-17 09:57:01 +0000 UTC" firstStartedPulling="2026-02-17 09:57:01.935270483 +0000 UTC m=+3099.478526129" lastFinishedPulling="2026-02-17 09:57:02.73362375 +0000 UTC m=+3100.276879416" observedRunningTime="2026-02-17 09:57:03.98869117 +0000 UTC m=+3101.531946816" watchObservedRunningTime="2026-02-17 09:57:03.99681248 +0000 UTC m=+3101.540068126" Feb 17 09:57:18 crc kubenswrapper[4848]: I0217 09:57:18.772053 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 09:57:18 crc kubenswrapper[4848]: I0217 09:57:18.772660 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 09:57:18 crc kubenswrapper[4848]: I0217 09:57:18.772720 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 09:57:18 crc kubenswrapper[4848]: I0217 09:57:18.773612 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 09:57:18 crc kubenswrapper[4848]: I0217 09:57:18.773685 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" gracePeriod=600 Feb 17 09:57:18 crc kubenswrapper[4848]: E0217 09:57:18.979429 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:57:19 crc 
kubenswrapper[4848]: I0217 09:57:19.132966 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" exitCode=0 Feb 17 09:57:19 crc kubenswrapper[4848]: I0217 09:57:19.133023 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5"} Feb 17 09:57:19 crc kubenswrapper[4848]: I0217 09:57:19.133064 4848 scope.go:117] "RemoveContainer" containerID="54b4c054c8795f93009bd8a79273fe782e5dc24d5d27896029a69001600777f4" Feb 17 09:57:19 crc kubenswrapper[4848]: I0217 09:57:19.133777 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:57:19 crc kubenswrapper[4848]: E0217 09:57:19.134097 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.656938 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zndr7/must-gather-2cxt8"] Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.660026 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zndr7/must-gather-2cxt8" Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.664702 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zndr7"/"default-dockercfg-lnvqc" Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.664852 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zndr7"/"openshift-service-ca.crt" Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.665209 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zndr7"/"kube-root-ca.crt" Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.670905 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zndr7/must-gather-2cxt8"] Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.848081 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw2k2\" (UniqueName: \"kubernetes.io/projected/0be33e07-83c0-4b4d-b24c-3b9c98467671-kube-api-access-gw2k2\") pod \"must-gather-2cxt8\" (UID: \"0be33e07-83c0-4b4d-b24c-3b9c98467671\") " pod="openshift-must-gather-zndr7/must-gather-2cxt8" Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.848224 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0be33e07-83c0-4b4d-b24c-3b9c98467671-must-gather-output\") pod \"must-gather-2cxt8\" (UID: \"0be33e07-83c0-4b4d-b24c-3b9c98467671\") " pod="openshift-must-gather-zndr7/must-gather-2cxt8" Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.950497 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw2k2\" (UniqueName: \"kubernetes.io/projected/0be33e07-83c0-4b4d-b24c-3b9c98467671-kube-api-access-gw2k2\") pod \"must-gather-2cxt8\" (UID: \"0be33e07-83c0-4b4d-b24c-3b9c98467671\") " 
pod="openshift-must-gather-zndr7/must-gather-2cxt8" Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.950892 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0be33e07-83c0-4b4d-b24c-3b9c98467671-must-gather-output\") pod \"must-gather-2cxt8\" (UID: \"0be33e07-83c0-4b4d-b24c-3b9c98467671\") " pod="openshift-must-gather-zndr7/must-gather-2cxt8" Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.951258 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0be33e07-83c0-4b4d-b24c-3b9c98467671-must-gather-output\") pod \"must-gather-2cxt8\" (UID: \"0be33e07-83c0-4b4d-b24c-3b9c98467671\") " pod="openshift-must-gather-zndr7/must-gather-2cxt8" Feb 17 09:57:24 crc kubenswrapper[4848]: I0217 09:57:24.986823 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw2k2\" (UniqueName: \"kubernetes.io/projected/0be33e07-83c0-4b4d-b24c-3b9c98467671-kube-api-access-gw2k2\") pod \"must-gather-2cxt8\" (UID: \"0be33e07-83c0-4b4d-b24c-3b9c98467671\") " pod="openshift-must-gather-zndr7/must-gather-2cxt8" Feb 17 09:57:25 crc kubenswrapper[4848]: I0217 09:57:25.287531 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zndr7/must-gather-2cxt8" Feb 17 09:57:25 crc kubenswrapper[4848]: I0217 09:57:25.808217 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zndr7/must-gather-2cxt8"] Feb 17 09:57:26 crc kubenswrapper[4848]: I0217 09:57:26.202340 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zndr7/must-gather-2cxt8" event={"ID":"0be33e07-83c0-4b4d-b24c-3b9c98467671","Type":"ContainerStarted","Data":"599f9384bc709f8c4ff56bd582cfeab27ab71b1677846b9bf023dff07099e120"} Feb 17 09:57:32 crc kubenswrapper[4848]: I0217 09:57:32.253799 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zndr7/must-gather-2cxt8" event={"ID":"0be33e07-83c0-4b4d-b24c-3b9c98467671","Type":"ContainerStarted","Data":"2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e"} Feb 17 09:57:32 crc kubenswrapper[4848]: I0217 09:57:32.254375 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zndr7/must-gather-2cxt8" event={"ID":"0be33e07-83c0-4b4d-b24c-3b9c98467671","Type":"ContainerStarted","Data":"d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9"} Feb 17 09:57:32 crc kubenswrapper[4848]: I0217 09:57:32.273680 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zndr7/must-gather-2cxt8" podStartSLOduration=2.421850654 podStartE2EDuration="8.273659041s" podCreationTimestamp="2026-02-17 09:57:24 +0000 UTC" firstStartedPulling="2026-02-17 09:57:25.820811901 +0000 UTC m=+3123.364067547" lastFinishedPulling="2026-02-17 09:57:31.672620278 +0000 UTC m=+3129.215875934" observedRunningTime="2026-02-17 09:57:32.269247856 +0000 UTC m=+3129.812503512" watchObservedRunningTime="2026-02-17 09:57:32.273659041 +0000 UTC m=+3129.816914697" Feb 17 09:57:33 crc kubenswrapper[4848]: I0217 09:57:33.389911 4848 scope.go:117] "RemoveContainer" 
containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:57:33 crc kubenswrapper[4848]: E0217 09:57:33.390689 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:57:35 crc kubenswrapper[4848]: I0217 09:57:35.678972 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zndr7/crc-debug-7jfqw"] Feb 17 09:57:35 crc kubenswrapper[4848]: I0217 09:57:35.680852 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-7jfqw" Feb 17 09:57:35 crc kubenswrapper[4848]: I0217 09:57:35.860136 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4rzl\" (UniqueName: \"kubernetes.io/projected/99c6820c-4699-446a-9bfe-4fc4a6c83630-kube-api-access-n4rzl\") pod \"crc-debug-7jfqw\" (UID: \"99c6820c-4699-446a-9bfe-4fc4a6c83630\") " pod="openshift-must-gather-zndr7/crc-debug-7jfqw" Feb 17 09:57:35 crc kubenswrapper[4848]: I0217 09:57:35.860240 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99c6820c-4699-446a-9bfe-4fc4a6c83630-host\") pod \"crc-debug-7jfqw\" (UID: \"99c6820c-4699-446a-9bfe-4fc4a6c83630\") " pod="openshift-must-gather-zndr7/crc-debug-7jfqw" Feb 17 09:57:35 crc kubenswrapper[4848]: I0217 09:57:35.962421 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99c6820c-4699-446a-9bfe-4fc4a6c83630-host\") pod \"crc-debug-7jfqw\" (UID: 
\"99c6820c-4699-446a-9bfe-4fc4a6c83630\") " pod="openshift-must-gather-zndr7/crc-debug-7jfqw" Feb 17 09:57:35 crc kubenswrapper[4848]: I0217 09:57:35.962598 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99c6820c-4699-446a-9bfe-4fc4a6c83630-host\") pod \"crc-debug-7jfqw\" (UID: \"99c6820c-4699-446a-9bfe-4fc4a6c83630\") " pod="openshift-must-gather-zndr7/crc-debug-7jfqw" Feb 17 09:57:35 crc kubenswrapper[4848]: I0217 09:57:35.962618 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4rzl\" (UniqueName: \"kubernetes.io/projected/99c6820c-4699-446a-9bfe-4fc4a6c83630-kube-api-access-n4rzl\") pod \"crc-debug-7jfqw\" (UID: \"99c6820c-4699-446a-9bfe-4fc4a6c83630\") " pod="openshift-must-gather-zndr7/crc-debug-7jfqw" Feb 17 09:57:35 crc kubenswrapper[4848]: I0217 09:57:35.982506 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4rzl\" (UniqueName: \"kubernetes.io/projected/99c6820c-4699-446a-9bfe-4fc4a6c83630-kube-api-access-n4rzl\") pod \"crc-debug-7jfqw\" (UID: \"99c6820c-4699-446a-9bfe-4fc4a6c83630\") " pod="openshift-must-gather-zndr7/crc-debug-7jfqw" Feb 17 09:57:35 crc kubenswrapper[4848]: I0217 09:57:35.998364 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-7jfqw" Feb 17 09:57:36 crc kubenswrapper[4848]: W0217 09:57:36.052544 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99c6820c_4699_446a_9bfe_4fc4a6c83630.slice/crio-39a9f15c18472f3063ecc483f5cf207ef4734df97f8044b069e72976dda96a08 WatchSource:0}: Error finding container 39a9f15c18472f3063ecc483f5cf207ef4734df97f8044b069e72976dda96a08: Status 404 returned error can't find the container with id 39a9f15c18472f3063ecc483f5cf207ef4734df97f8044b069e72976dda96a08 Feb 17 09:57:36 crc kubenswrapper[4848]: I0217 09:57:36.289602 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zndr7/crc-debug-7jfqw" event={"ID":"99c6820c-4699-446a-9bfe-4fc4a6c83630","Type":"ContainerStarted","Data":"39a9f15c18472f3063ecc483f5cf207ef4734df97f8044b069e72976dda96a08"} Feb 17 09:57:47 crc kubenswrapper[4848]: I0217 09:57:47.383453 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:57:47 crc kubenswrapper[4848]: E0217 09:57:47.384082 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:57:48 crc kubenswrapper[4848]: I0217 09:57:48.406284 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zndr7/crc-debug-7jfqw" event={"ID":"99c6820c-4699-446a-9bfe-4fc4a6c83630","Type":"ContainerStarted","Data":"8f6717ae42baab431754271d57fec65e690202617cf6eae950dc50929a27c255"} Feb 17 09:57:58 crc kubenswrapper[4848]: I0217 09:57:58.383052 4848 
scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:57:58 crc kubenswrapper[4848]: E0217 09:57:58.383864 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:58:09 crc kubenswrapper[4848]: I0217 09:58:09.384429 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:58:09 crc kubenswrapper[4848]: E0217 09:58:09.385295 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:58:22 crc kubenswrapper[4848]: I0217 09:58:22.384904 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:58:22 crc kubenswrapper[4848]: E0217 09:58:22.386248 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:58:28 crc kubenswrapper[4848]: I0217 
09:58:28.805887 4848 generic.go:334] "Generic (PLEG): container finished" podID="99c6820c-4699-446a-9bfe-4fc4a6c83630" containerID="8f6717ae42baab431754271d57fec65e690202617cf6eae950dc50929a27c255" exitCode=0 Feb 17 09:58:28 crc kubenswrapper[4848]: I0217 09:58:28.806478 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zndr7/crc-debug-7jfqw" event={"ID":"99c6820c-4699-446a-9bfe-4fc4a6c83630","Type":"ContainerDied","Data":"8f6717ae42baab431754271d57fec65e690202617cf6eae950dc50929a27c255"} Feb 17 09:58:29 crc kubenswrapper[4848]: I0217 09:58:29.919235 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-7jfqw" Feb 17 09:58:29 crc kubenswrapper[4848]: I0217 09:58:29.951327 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zndr7/crc-debug-7jfqw"] Feb 17 09:58:29 crc kubenswrapper[4848]: I0217 09:58:29.959700 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4rzl\" (UniqueName: \"kubernetes.io/projected/99c6820c-4699-446a-9bfe-4fc4a6c83630-kube-api-access-n4rzl\") pod \"99c6820c-4699-446a-9bfe-4fc4a6c83630\" (UID: \"99c6820c-4699-446a-9bfe-4fc4a6c83630\") " Feb 17 09:58:29 crc kubenswrapper[4848]: I0217 09:58:29.960082 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99c6820c-4699-446a-9bfe-4fc4a6c83630-host\") pod \"99c6820c-4699-446a-9bfe-4fc4a6c83630\" (UID: \"99c6820c-4699-446a-9bfe-4fc4a6c83630\") " Feb 17 09:58:29 crc kubenswrapper[4848]: I0217 09:58:29.960189 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/99c6820c-4699-446a-9bfe-4fc4a6c83630-host" (OuterVolumeSpecName: "host") pod "99c6820c-4699-446a-9bfe-4fc4a6c83630" (UID: "99c6820c-4699-446a-9bfe-4fc4a6c83630"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:58:29 crc kubenswrapper[4848]: I0217 09:58:29.960809 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99c6820c-4699-446a-9bfe-4fc4a6c83630-host\") on node \"crc\" DevicePath \"\"" Feb 17 09:58:29 crc kubenswrapper[4848]: I0217 09:58:29.967012 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zndr7/crc-debug-7jfqw"] Feb 17 09:58:29 crc kubenswrapper[4848]: I0217 09:58:29.967200 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99c6820c-4699-446a-9bfe-4fc4a6c83630-kube-api-access-n4rzl" (OuterVolumeSpecName: "kube-api-access-n4rzl") pod "99c6820c-4699-446a-9bfe-4fc4a6c83630" (UID: "99c6820c-4699-446a-9bfe-4fc4a6c83630"). InnerVolumeSpecName "kube-api-access-n4rzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:58:30 crc kubenswrapper[4848]: I0217 09:58:30.063065 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4rzl\" (UniqueName: \"kubernetes.io/projected/99c6820c-4699-446a-9bfe-4fc4a6c83630-kube-api-access-n4rzl\") on node \"crc\" DevicePath \"\"" Feb 17 09:58:30 crc kubenswrapper[4848]: I0217 09:58:30.823672 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39a9f15c18472f3063ecc483f5cf207ef4734df97f8044b069e72976dda96a08" Feb 17 09:58:30 crc kubenswrapper[4848]: I0217 09:58:30.823724 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-7jfqw" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.144509 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zndr7/crc-debug-wdwkh"] Feb 17 09:58:31 crc kubenswrapper[4848]: E0217 09:58:31.145115 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99c6820c-4699-446a-9bfe-4fc4a6c83630" containerName="container-00" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.145127 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="99c6820c-4699-446a-9bfe-4fc4a6c83630" containerName="container-00" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.145293 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="99c6820c-4699-446a-9bfe-4fc4a6c83630" containerName="container-00" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.145916 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-wdwkh" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.186710 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/27fb645b-ea86-4236-8987-2116cccc6c50-host\") pod \"crc-debug-wdwkh\" (UID: \"27fb645b-ea86-4236-8987-2116cccc6c50\") " pod="openshift-must-gather-zndr7/crc-debug-wdwkh" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.186862 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzfh9\" (UniqueName: \"kubernetes.io/projected/27fb645b-ea86-4236-8987-2116cccc6c50-kube-api-access-tzfh9\") pod \"crc-debug-wdwkh\" (UID: \"27fb645b-ea86-4236-8987-2116cccc6c50\") " pod="openshift-must-gather-zndr7/crc-debug-wdwkh" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.289857 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/27fb645b-ea86-4236-8987-2116cccc6c50-host\") pod \"crc-debug-wdwkh\" (UID: \"27fb645b-ea86-4236-8987-2116cccc6c50\") " pod="openshift-must-gather-zndr7/crc-debug-wdwkh" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.289971 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzfh9\" (UniqueName: \"kubernetes.io/projected/27fb645b-ea86-4236-8987-2116cccc6c50-kube-api-access-tzfh9\") pod \"crc-debug-wdwkh\" (UID: \"27fb645b-ea86-4236-8987-2116cccc6c50\") " pod="openshift-must-gather-zndr7/crc-debug-wdwkh" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.290080 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/27fb645b-ea86-4236-8987-2116cccc6c50-host\") pod \"crc-debug-wdwkh\" (UID: \"27fb645b-ea86-4236-8987-2116cccc6c50\") " pod="openshift-must-gather-zndr7/crc-debug-wdwkh" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.326400 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzfh9\" (UniqueName: \"kubernetes.io/projected/27fb645b-ea86-4236-8987-2116cccc6c50-kube-api-access-tzfh9\") pod \"crc-debug-wdwkh\" (UID: \"27fb645b-ea86-4236-8987-2116cccc6c50\") " pod="openshift-must-gather-zndr7/crc-debug-wdwkh" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.394572 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99c6820c-4699-446a-9bfe-4fc4a6c83630" path="/var/lib/kubelet/pods/99c6820c-4699-446a-9bfe-4fc4a6c83630/volumes" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.462911 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-wdwkh" Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.836181 4848 generic.go:334] "Generic (PLEG): container finished" podID="27fb645b-ea86-4236-8987-2116cccc6c50" containerID="66b7997ba1139f0f321ce7b83d9ddb1c719f4d35c19fefccb6fa815d1a37cf67" exitCode=0 Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.836418 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zndr7/crc-debug-wdwkh" event={"ID":"27fb645b-ea86-4236-8987-2116cccc6c50","Type":"ContainerDied","Data":"66b7997ba1139f0f321ce7b83d9ddb1c719f4d35c19fefccb6fa815d1a37cf67"} Feb 17 09:58:31 crc kubenswrapper[4848]: I0217 09:58:31.836708 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zndr7/crc-debug-wdwkh" event={"ID":"27fb645b-ea86-4236-8987-2116cccc6c50","Type":"ContainerStarted","Data":"3feca7d04251e72328c5549f0c18bde390bf090ff647a96e30b8d5a38c440578"} Feb 17 09:58:32 crc kubenswrapper[4848]: I0217 09:58:32.322328 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zndr7/crc-debug-wdwkh"] Feb 17 09:58:32 crc kubenswrapper[4848]: I0217 09:58:32.330816 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zndr7/crc-debug-wdwkh"] Feb 17 09:58:32 crc kubenswrapper[4848]: I0217 09:58:32.969249 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-wdwkh" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.021447 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzfh9\" (UniqueName: \"kubernetes.io/projected/27fb645b-ea86-4236-8987-2116cccc6c50-kube-api-access-tzfh9\") pod \"27fb645b-ea86-4236-8987-2116cccc6c50\" (UID: \"27fb645b-ea86-4236-8987-2116cccc6c50\") " Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.021934 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/27fb645b-ea86-4236-8987-2116cccc6c50-host\") pod \"27fb645b-ea86-4236-8987-2116cccc6c50\" (UID: \"27fb645b-ea86-4236-8987-2116cccc6c50\") " Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.022057 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/27fb645b-ea86-4236-8987-2116cccc6c50-host" (OuterVolumeSpecName: "host") pod "27fb645b-ea86-4236-8987-2116cccc6c50" (UID: "27fb645b-ea86-4236-8987-2116cccc6c50"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.023113 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/27fb645b-ea86-4236-8987-2116cccc6c50-host\") on node \"crc\" DevicePath \"\"" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.032521 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27fb645b-ea86-4236-8987-2116cccc6c50-kube-api-access-tzfh9" (OuterVolumeSpecName: "kube-api-access-tzfh9") pod "27fb645b-ea86-4236-8987-2116cccc6c50" (UID: "27fb645b-ea86-4236-8987-2116cccc6c50"). InnerVolumeSpecName "kube-api-access-tzfh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.125181 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzfh9\" (UniqueName: \"kubernetes.io/projected/27fb645b-ea86-4236-8987-2116cccc6c50-kube-api-access-tzfh9\") on node \"crc\" DevicePath \"\"" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.396676 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27fb645b-ea86-4236-8987-2116cccc6c50" path="/var/lib/kubelet/pods/27fb645b-ea86-4236-8987-2116cccc6c50/volumes" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.565640 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zndr7/crc-debug-c9zb7"] Feb 17 09:58:33 crc kubenswrapper[4848]: E0217 09:58:33.566168 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27fb645b-ea86-4236-8987-2116cccc6c50" containerName="container-00" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.566199 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="27fb645b-ea86-4236-8987-2116cccc6c50" containerName="container-00" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.566516 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="27fb645b-ea86-4236-8987-2116cccc6c50" containerName="container-00" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.567335 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-c9zb7" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.635832 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de4ab4ea-ad1c-4c6f-852e-2636fe929069-host\") pod \"crc-debug-c9zb7\" (UID: \"de4ab4ea-ad1c-4c6f-852e-2636fe929069\") " pod="openshift-must-gather-zndr7/crc-debug-c9zb7" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.635956 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp4d4\" (UniqueName: \"kubernetes.io/projected/de4ab4ea-ad1c-4c6f-852e-2636fe929069-kube-api-access-zp4d4\") pod \"crc-debug-c9zb7\" (UID: \"de4ab4ea-ad1c-4c6f-852e-2636fe929069\") " pod="openshift-must-gather-zndr7/crc-debug-c9zb7" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.738137 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de4ab4ea-ad1c-4c6f-852e-2636fe929069-host\") pod \"crc-debug-c9zb7\" (UID: \"de4ab4ea-ad1c-4c6f-852e-2636fe929069\") " pod="openshift-must-gather-zndr7/crc-debug-c9zb7" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.738234 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp4d4\" (UniqueName: \"kubernetes.io/projected/de4ab4ea-ad1c-4c6f-852e-2636fe929069-kube-api-access-zp4d4\") pod \"crc-debug-c9zb7\" (UID: \"de4ab4ea-ad1c-4c6f-852e-2636fe929069\") " pod="openshift-must-gather-zndr7/crc-debug-c9zb7" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.738332 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de4ab4ea-ad1c-4c6f-852e-2636fe929069-host\") pod \"crc-debug-c9zb7\" (UID: \"de4ab4ea-ad1c-4c6f-852e-2636fe929069\") " pod="openshift-must-gather-zndr7/crc-debug-c9zb7" Feb 17 09:58:33 crc 
kubenswrapper[4848]: I0217 09:58:33.765793 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp4d4\" (UniqueName: \"kubernetes.io/projected/de4ab4ea-ad1c-4c6f-852e-2636fe929069-kube-api-access-zp4d4\") pod \"crc-debug-c9zb7\" (UID: \"de4ab4ea-ad1c-4c6f-852e-2636fe929069\") " pod="openshift-must-gather-zndr7/crc-debug-c9zb7" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.856347 4848 scope.go:117] "RemoveContainer" containerID="66b7997ba1139f0f321ce7b83d9ddb1c719f4d35c19fefccb6fa815d1a37cf67" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.856354 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-wdwkh" Feb 17 09:58:33 crc kubenswrapper[4848]: I0217 09:58:33.890209 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-c9zb7" Feb 17 09:58:34 crc kubenswrapper[4848]: I0217 09:58:34.383452 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:58:34 crc kubenswrapper[4848]: E0217 09:58:34.384132 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:58:34 crc kubenswrapper[4848]: I0217 09:58:34.874106 4848 generic.go:334] "Generic (PLEG): container finished" podID="de4ab4ea-ad1c-4c6f-852e-2636fe929069" containerID="53f54acd2871ab152afd4da07aca250bdad1cba8b23675e31b31a741f839fca9" exitCode=0 Feb 17 09:58:34 crc kubenswrapper[4848]: I0217 09:58:34.874158 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-zndr7/crc-debug-c9zb7" event={"ID":"de4ab4ea-ad1c-4c6f-852e-2636fe929069","Type":"ContainerDied","Data":"53f54acd2871ab152afd4da07aca250bdad1cba8b23675e31b31a741f839fca9"} Feb 17 09:58:34 crc kubenswrapper[4848]: I0217 09:58:34.874200 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zndr7/crc-debug-c9zb7" event={"ID":"de4ab4ea-ad1c-4c6f-852e-2636fe929069","Type":"ContainerStarted","Data":"af13d31554cce3e430a81aef7847fd58b05116642cc887308932785c6a740ab4"} Feb 17 09:58:34 crc kubenswrapper[4848]: I0217 09:58:34.935369 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zndr7/crc-debug-c9zb7"] Feb 17 09:58:34 crc kubenswrapper[4848]: I0217 09:58:34.950250 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zndr7/crc-debug-c9zb7"] Feb 17 09:58:35 crc kubenswrapper[4848]: I0217 09:58:35.983321 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-c9zb7" Feb 17 09:58:36 crc kubenswrapper[4848]: I0217 09:58:36.079053 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de4ab4ea-ad1c-4c6f-852e-2636fe929069-host\") pod \"de4ab4ea-ad1c-4c6f-852e-2636fe929069\" (UID: \"de4ab4ea-ad1c-4c6f-852e-2636fe929069\") " Feb 17 09:58:36 crc kubenswrapper[4848]: I0217 09:58:36.079375 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp4d4\" (UniqueName: \"kubernetes.io/projected/de4ab4ea-ad1c-4c6f-852e-2636fe929069-kube-api-access-zp4d4\") pod \"de4ab4ea-ad1c-4c6f-852e-2636fe929069\" (UID: \"de4ab4ea-ad1c-4c6f-852e-2636fe929069\") " Feb 17 09:58:36 crc kubenswrapper[4848]: I0217 09:58:36.079137 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de4ab4ea-ad1c-4c6f-852e-2636fe929069-host" (OuterVolumeSpecName: "host") pod 
"de4ab4ea-ad1c-4c6f-852e-2636fe929069" (UID: "de4ab4ea-ad1c-4c6f-852e-2636fe929069"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 09:58:36 crc kubenswrapper[4848]: I0217 09:58:36.079906 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de4ab4ea-ad1c-4c6f-852e-2636fe929069-host\") on node \"crc\" DevicePath \"\"" Feb 17 09:58:36 crc kubenswrapper[4848]: I0217 09:58:36.085040 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de4ab4ea-ad1c-4c6f-852e-2636fe929069-kube-api-access-zp4d4" (OuterVolumeSpecName: "kube-api-access-zp4d4") pod "de4ab4ea-ad1c-4c6f-852e-2636fe929069" (UID: "de4ab4ea-ad1c-4c6f-852e-2636fe929069"). InnerVolumeSpecName "kube-api-access-zp4d4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 09:58:36 crc kubenswrapper[4848]: I0217 09:58:36.181962 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp4d4\" (UniqueName: \"kubernetes.io/projected/de4ab4ea-ad1c-4c6f-852e-2636fe929069-kube-api-access-zp4d4\") on node \"crc\" DevicePath \"\"" Feb 17 09:58:36 crc kubenswrapper[4848]: I0217 09:58:36.897988 4848 scope.go:117] "RemoveContainer" containerID="53f54acd2871ab152afd4da07aca250bdad1cba8b23675e31b31a741f839fca9" Feb 17 09:58:36 crc kubenswrapper[4848]: I0217 09:58:36.898108 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zndr7/crc-debug-c9zb7" Feb 17 09:58:37 crc kubenswrapper[4848]: I0217 09:58:37.394932 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de4ab4ea-ad1c-4c6f-852e-2636fe929069" path="/var/lib/kubelet/pods/de4ab4ea-ad1c-4c6f-852e-2636fe929069/volumes" Feb 17 09:58:48 crc kubenswrapper[4848]: I0217 09:58:48.384171 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:58:48 crc kubenswrapper[4848]: E0217 09:58:48.385373 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:58:50 crc kubenswrapper[4848]: I0217 09:58:50.836173 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56c75f4b6d-vbll8_63d2c1b3-b181-4afe-8cb1-3049a34c47d2/barbican-api/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.017308 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56c75f4b6d-vbll8_63d2c1b3-b181-4afe-8cb1-3049a34c47d2/barbican-api-log/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.091124 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c677d9df8-z5nnn_49b9137d-75ca-4b52-9338-6bf15270a667/barbican-keystone-listener/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.126246 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c677d9df8-z5nnn_49b9137d-75ca-4b52-9338-6bf15270a667/barbican-keystone-listener-log/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: 
I0217 09:58:51.241254 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-848f449699-2nhmn_dd92adeb-535d-4d36-a176-b5cd3ca667dc/barbican-worker/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.338391 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-848f449699-2nhmn_dd92adeb-535d-4d36-a176-b5cd3ca667dc/barbican-worker-log/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.445342 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2_9eaf40fa-e0f2-445b-a17b-98f88fc76a5e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.594817 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65294004-8016-43fa-8017-90cb36bb8dcb/ceilometer-central-agent/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.775338 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65294004-8016-43fa-8017-90cb36bb8dcb/ceilometer-notification-agent/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.781860 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65294004-8016-43fa-8017-90cb36bb8dcb/proxy-httpd/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.866405 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65294004-8016-43fa-8017-90cb36bb8dcb/sg-core/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.983642 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a637d69f-499a-4308-89a8-fad8fe4e6d59/cinder-api/0.log" Feb 17 09:58:51 crc kubenswrapper[4848]: I0217 09:58:51.989953 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a637d69f-499a-4308-89a8-fad8fe4e6d59/cinder-api-log/0.log" Feb 17 09:58:52 crc 
kubenswrapper[4848]: I0217 09:58:52.137277 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_96a17dca-14f1-42ed-aca6-45fc15067cd3/cinder-scheduler/0.log" Feb 17 09:58:52 crc kubenswrapper[4848]: I0217 09:58:52.195331 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_96a17dca-14f1-42ed-aca6-45fc15067cd3/probe/0.log" Feb 17 09:58:52 crc kubenswrapper[4848]: I0217 09:58:52.315368 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv_9c1fceab-33b4-4eee-8e26-c9bc2a35f018/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:52 crc kubenswrapper[4848]: I0217 09:58:52.411581 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h_cbf54dd4-b933-400a-bef2-44bc87fbf3de/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:52 crc kubenswrapper[4848]: I0217 09:58:52.528897 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b9df5dcdc-8rdwv_84c45378-b510-419e-83b7-b92a19292d39/init/0.log" Feb 17 09:58:52 crc kubenswrapper[4848]: I0217 09:58:52.683709 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b9df5dcdc-8rdwv_84c45378-b510-419e-83b7-b92a19292d39/init/0.log" Feb 17 09:58:52 crc kubenswrapper[4848]: I0217 09:58:52.739975 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b9df5dcdc-8rdwv_84c45378-b510-419e-83b7-b92a19292d39/dnsmasq-dns/0.log" Feb 17 09:58:52 crc kubenswrapper[4848]: I0217 09:58:52.749480 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-nztrh_502cc85d-fb83-4c34-825f-5aca6c880af7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:52 crc kubenswrapper[4848]: I0217 09:58:52.899776 4848 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379/glance-httpd/0.log" Feb 17 09:58:52 crc kubenswrapper[4848]: I0217 09:58:52.902070 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379/glance-log/0.log" Feb 17 09:58:53 crc kubenswrapper[4848]: I0217 09:58:53.044036 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f89c214-b934-465f-86ee-dec5f742237e/glance-log/0.log" Feb 17 09:58:53 crc kubenswrapper[4848]: I0217 09:58:53.081326 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f89c214-b934-465f-86ee-dec5f742237e/glance-httpd/0.log" Feb 17 09:58:53 crc kubenswrapper[4848]: I0217 09:58:53.231939 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-676bdd79dd-lq228_96fd6f0e-96ad-4a88-85ff-78f450b24279/horizon/0.log" Feb 17 09:58:53 crc kubenswrapper[4848]: I0217 09:58:53.372538 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s_3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:53 crc kubenswrapper[4848]: I0217 09:58:53.530072 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-676bdd79dd-lq228_96fd6f0e-96ad-4a88-85ff-78f450b24279/horizon-log/0.log" Feb 17 09:58:53 crc kubenswrapper[4848]: I0217 09:58:53.568537 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zp8bp_55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:53 crc kubenswrapper[4848]: I0217 09:58:53.836234 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_kube-state-metrics-0_10c3da4a-6e24-4b18-8c11-26d2255aebcc/kube-state-metrics/0.log" Feb 17 09:58:53 crc kubenswrapper[4848]: I0217 09:58:53.918885 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76c7ffd8bf-x42cc_fc74976d-87c5-406c-9e25-4f89c5fc2307/keystone-api/0.log" Feb 17 09:58:54 crc kubenswrapper[4848]: I0217 09:58:54.039825 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk_a694769e-5bc0-4596-945c-2de9823168f0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:54 crc kubenswrapper[4848]: I0217 09:58:54.374747 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-685b8f6845-8tvq5_e060d08b-cb90-4fe6-badb-ae482aeb505d/neutron-httpd/0.log" Feb 17 09:58:54 crc kubenswrapper[4848]: I0217 09:58:54.411653 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-685b8f6845-8tvq5_e060d08b-cb90-4fe6-badb-ae482aeb505d/neutron-api/0.log" Feb 17 09:58:54 crc kubenswrapper[4848]: I0217 09:58:54.653055 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b_cb01f719-b45c-48ab-ba4a-6ffeef0d8b92/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:55 crc kubenswrapper[4848]: I0217 09:58:55.290827 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7/nova-api-log/0.log" Feb 17 09:58:55 crc kubenswrapper[4848]: I0217 09:58:55.338178 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4a8dbe1d-cea3-4cf7-a8ef-210410453732/nova-cell0-conductor-conductor/0.log" Feb 17 09:58:55 crc kubenswrapper[4848]: I0217 09:58:55.523130 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7/nova-api-api/0.log" Feb 17 
09:58:55 crc kubenswrapper[4848]: I0217 09:58:55.577485 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_828cf207-286c-427a-80f4-5713b1128ecc/nova-cell1-conductor-conductor/0.log" Feb 17 09:58:55 crc kubenswrapper[4848]: I0217 09:58:55.614249 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0b2f03bc-3c70-4dc8-9478-5474155fdf90/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 09:58:55 crc kubenswrapper[4848]: I0217 09:58:55.837050 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jnqsf_8b957c61-bd51-415a-9d34-da20cb8ebd55/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:55 crc kubenswrapper[4848]: I0217 09:58:55.914770 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28ef5b18-4ce7-4850-b6c8-70e0727fc805/nova-metadata-log/0.log" Feb 17 09:58:56 crc kubenswrapper[4848]: I0217 09:58:56.316412 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_627e2420-02b2-4269-adc9-573cd91cccd9/nova-scheduler-scheduler/0.log" Feb 17 09:58:56 crc kubenswrapper[4848]: I0217 09:58:56.450914 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_241bdede-0e36-4cfa-965b-89449d5f84f0/mysql-bootstrap/0.log" Feb 17 09:58:56 crc kubenswrapper[4848]: I0217 09:58:56.558170 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_241bdede-0e36-4cfa-965b-89449d5f84f0/galera/0.log" Feb 17 09:58:56 crc kubenswrapper[4848]: I0217 09:58:56.578602 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_241bdede-0e36-4cfa-965b-89449d5f84f0/mysql-bootstrap/0.log" Feb 17 09:58:56 crc kubenswrapper[4848]: I0217 09:58:56.829252 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_ac51f8f5-cf36-44ef-b849-9bd6265e5156/mysql-bootstrap/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.019251 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ac51f8f5-cf36-44ef-b849-9bd6265e5156/mysql-bootstrap/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.030908 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ac51f8f5-cf36-44ef-b849-9bd6265e5156/galera/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.138220 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28ef5b18-4ce7-4850-b6c8-70e0727fc805/nova-metadata-metadata/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.230194 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5211bb87-9d50-485f-aa61-43f8d57339c7/openstackclient/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.335795 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-c695f_43e80552-f64e-4257-a460-f108ee513c12/ovn-controller/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.480374 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-28cvn_1f7ecdca-433f-4bcc-a3d5-e433a8db3bad/openstack-network-exporter/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.558120 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbwkv_e09d3b82-ad17-461a-89eb-b8ee45d4edff/ovsdb-server-init/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.774061 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbwkv_e09d3b82-ad17-461a-89eb-b8ee45d4edff/ovs-vswitchd/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.786817 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-jbwkv_e09d3b82-ad17-461a-89eb-b8ee45d4edff/ovsdb-server-init/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.795982 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbwkv_e09d3b82-ad17-461a-89eb-b8ee45d4edff/ovsdb-server/0.log" Feb 17 09:58:57 crc kubenswrapper[4848]: I0217 09:58:57.983825 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5263f1c0-e02a-4383-ae9f-3b223486a59e/openstack-network-exporter/0.log" Feb 17 09:58:58 crc kubenswrapper[4848]: I0217 09:58:58.042705 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-mq87l_3db20d69-cab8-4176-a71c-172899e90c3d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:58 crc kubenswrapper[4848]: I0217 09:58:58.054573 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5263f1c0-e02a-4383-ae9f-3b223486a59e/ovn-northd/0.log" Feb 17 09:58:58 crc kubenswrapper[4848]: I0217 09:58:58.242833 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9ccc9c6f-4e19-464f-9e06-7a3951c63c85/openstack-network-exporter/0.log" Feb 17 09:58:58 crc kubenswrapper[4848]: I0217 09:58:58.262235 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9ccc9c6f-4e19-464f-9e06-7a3951c63c85/ovsdbserver-nb/0.log" Feb 17 09:58:58 crc kubenswrapper[4848]: I0217 09:58:58.423003 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0c03c6cc-b85f-465f-b692-8f50eaca7cd6/openstack-network-exporter/0.log" Feb 17 09:58:58 crc kubenswrapper[4848]: I0217 09:58:58.503116 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0c03c6cc-b85f-465f-b692-8f50eaca7cd6/ovsdbserver-sb/0.log" Feb 17 09:58:58 crc kubenswrapper[4848]: I0217 09:58:58.606873 4848 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_placement-7b6cdb54-tkxbl_eba884ff-2e19-4dca-ba2e-75a8a311ea19/placement-api/0.log" Feb 17 09:58:58 crc kubenswrapper[4848]: I0217 09:58:58.741794 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5fc69cf-e25e-4ca3-adc3-36b1678691e1/setup-container/0.log" Feb 17 09:58:58 crc kubenswrapper[4848]: I0217 09:58:58.743289 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b6cdb54-tkxbl_eba884ff-2e19-4dca-ba2e-75a8a311ea19/placement-log/0.log" Feb 17 09:58:58 crc kubenswrapper[4848]: I0217 09:58:58.940646 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5fc69cf-e25e-4ca3-adc3-36b1678691e1/setup-container/0.log" Feb 17 09:58:59 crc kubenswrapper[4848]: I0217 09:58:59.012472 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5fc69cf-e25e-4ca3-adc3-36b1678691e1/rabbitmq/0.log" Feb 17 09:58:59 crc kubenswrapper[4848]: I0217 09:58:59.066993 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_30e48298-cbbd-4637-83a9-733efaaf0756/setup-container/0.log" Feb 17 09:58:59 crc kubenswrapper[4848]: I0217 09:58:59.231250 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_30e48298-cbbd-4637-83a9-733efaaf0756/setup-container/0.log" Feb 17 09:58:59 crc kubenswrapper[4848]: I0217 09:58:59.248934 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_30e48298-cbbd-4637-83a9-733efaaf0756/rabbitmq/0.log" Feb 17 09:58:59 crc kubenswrapper[4848]: I0217 09:58:59.274268 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6_3230b202-405b-4545-b04f-8c01231f565e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:59 crc kubenswrapper[4848]: I0217 09:58:59.383106 4848 
scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:58:59 crc kubenswrapper[4848]: E0217 09:58:59.383352 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:58:59 crc kubenswrapper[4848]: I0217 09:58:59.435447 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-hm9jz_9762dbd7-6ed8-433c-a176-402586491e40/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:59 crc kubenswrapper[4848]: I0217 09:58:59.494738 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29_6dc6526d-a2c1-40d1-a503-71c4315cc00c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:59 crc kubenswrapper[4848]: I0217 09:58:59.721495 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-2vff5_14190cb8-1489-4fb2-8c06-0eb40f1f584e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:58:59 crc kubenswrapper[4848]: I0217 09:58:59.799073 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-6gvlr_9042f387-8534-4c6e-a64a-08154984ff7d/ssh-known-hosts-edpm-deployment/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.054621 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f467db6f-7x6cx_abcdb3d8-da38-472a-bdb3-e1615f832970/proxy-httpd/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.061939 4848 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_swift-proxy-f467db6f-7x6cx_abcdb3d8-da38-472a-bdb3-e1615f832970/proxy-server/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.185837 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-c6lrl_6a049c1c-b425-44cc-bde0-2e83be29d1a1/swift-ring-rebalance/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.272101 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/account-auditor/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.331219 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/account-reaper/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.447703 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/account-replicator/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.530197 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/account-server/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.581356 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/container-replicator/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.599898 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/container-auditor/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.668370 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/container-server/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.760817 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/container-updater/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.819005 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/object-auditor/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.831601 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/object-expirer/0.log" Feb 17 09:59:00 crc kubenswrapper[4848]: I0217 09:59:00.892262 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/object-replicator/0.log" Feb 17 09:59:01 crc kubenswrapper[4848]: I0217 09:59:01.015478 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/object-server/0.log" Feb 17 09:59:01 crc kubenswrapper[4848]: I0217 09:59:01.026128 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/object-updater/0.log" Feb 17 09:59:01 crc kubenswrapper[4848]: I0217 09:59:01.090299 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/rsync/0.log" Feb 17 09:59:01 crc kubenswrapper[4848]: I0217 09:59:01.127138 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/swift-recon-cron/0.log" Feb 17 09:59:01 crc kubenswrapper[4848]: I0217 09:59:01.375416 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d195d00b-8819-4a35-9e3c-6b4b21660400/tempest-tests-tempest-tests-runner/0.log" Feb 17 09:59:01 crc kubenswrapper[4848]: I0217 09:59:01.380645 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-x4455_78d70d99-629a-4211-9ead-66a16b766326/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:59:01 crc kubenswrapper[4848]: I0217 09:59:01.635727 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c3f9f72f-9857-440a-a108-80d6b424f2a3/test-operator-logs-container/0.log" Feb 17 09:59:01 crc kubenswrapper[4848]: I0217 09:59:01.653896 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd_31a5681d-60a0-455a-af52-e43f66fb1e93/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 09:59:09 crc kubenswrapper[4848]: I0217 09:59:09.628345 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c60672b9-d590-48a6-80c0-e3f74547b5c2/memcached/0.log" Feb 17 09:59:13 crc kubenswrapper[4848]: I0217 09:59:13.389171 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:59:13 crc kubenswrapper[4848]: E0217 09:59:13.389683 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:59:25 crc kubenswrapper[4848]: I0217 09:59:25.685074 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/util/0.log" Feb 17 09:59:25 crc kubenswrapper[4848]: I0217 09:59:25.804921 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/util/0.log" Feb 17 09:59:25 crc kubenswrapper[4848]: I0217 09:59:25.856933 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/pull/0.log" Feb 17 09:59:25 crc kubenswrapper[4848]: I0217 09:59:25.889022 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/pull/0.log" Feb 17 09:59:26 crc kubenswrapper[4848]: I0217 09:59:26.083162 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/util/0.log" Feb 17 09:59:26 crc kubenswrapper[4848]: I0217 09:59:26.098920 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/pull/0.log" Feb 17 09:59:26 crc kubenswrapper[4848]: I0217 09:59:26.112424 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/extract/0.log" Feb 17 09:59:26 crc kubenswrapper[4848]: I0217 09:59:26.775285 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-fbwmm_05876a75-9b3e-45b7-a3fe-89ab569742fd/manager/0.log" Feb 17 09:59:27 crc kubenswrapper[4848]: I0217 09:59:27.138355 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-958lw_17a4dcbd-4735-48d6-a575-f7d3af6843f1/manager/0.log" Feb 17 09:59:27 crc 
kubenswrapper[4848]: I0217 09:59:27.395380 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-86z5g_32c32c38-9ebf-4e9a-bea8-e761159dda5f/manager/0.log" Feb 17 09:59:27 crc kubenswrapper[4848]: I0217 09:59:27.588716 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-wlll6_b2e407ed-c962-4fcf-b367-f4164d644de6/manager/0.log" Feb 17 09:59:27 crc kubenswrapper[4848]: I0217 09:59:27.747352 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-5lpnr_ce2d3288-2b7d-4db8-861d-0a413fc90222/manager/0.log" Feb 17 09:59:28 crc kubenswrapper[4848]: I0217 09:59:28.084960 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-sw5bf_04f9fe37-de58-4b62-896e-0945a7bcbfdf/manager/0.log" Feb 17 09:59:28 crc kubenswrapper[4848]: I0217 09:59:28.263947 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-dlmq4_a2de98b6-28a9-446d-bc9b-ac7aad58be7d/manager/0.log" Feb 17 09:59:28 crc kubenswrapper[4848]: I0217 09:59:28.376956 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-9wvc8_97430748-300a-434e-a6b3-52274422ab66/manager/0.log" Feb 17 09:59:28 crc kubenswrapper[4848]: I0217 09:59:28.383256 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:59:28 crc kubenswrapper[4848]: E0217 09:59:28.383495 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:59:28 crc kubenswrapper[4848]: I0217 09:59:28.455482 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-qd7ds_e6251943-952f-4cbc-924c-b362d9f7c8da/manager/0.log" Feb 17 09:59:28 crc kubenswrapper[4848]: I0217 09:59:28.729284 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-lth6q_f151e0ea-ac05-426d-aa94-e32cc25fdc09/manager/0.log" Feb 17 09:59:28 crc kubenswrapper[4848]: I0217 09:59:28.917294 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-fkvjd_aa45cd11-5d86-47c3-b46e-15c0b204feb6/manager/0.log" Feb 17 09:59:29 crc kubenswrapper[4848]: I0217 09:59:29.174777 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-rvzql_88153939-7ca7-448d-a21c-b8330360b5a1/manager/0.log" Feb 17 09:59:29 crc kubenswrapper[4848]: I0217 09:59:29.440387 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr_f658d1a9-916e-41c9-8268-e94c22c6a045/manager/0.log" Feb 17 09:59:29 crc kubenswrapper[4848]: I0217 09:59:29.918040 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7f8db498b4-ps9gz_4fa72ae4-db9b-4093-92bf-7aeb4ab8b7ab/operator/0.log" Feb 17 09:59:30 crc kubenswrapper[4848]: I0217 09:59:30.398702 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-b2s6b_7b87255a-321f-4b26-bc23-a7d5aeff53e2/registry-server/0.log" Feb 17 09:59:30 crc 
kubenswrapper[4848]: I0217 09:59:30.675592 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-mcnbc_aebf2b53-aeb7-44a1-a2c2-3dfaa0e01bdd/manager/0.log" Feb 17 09:59:30 crc kubenswrapper[4848]: I0217 09:59:30.850384 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-ckggs_653da755-b43b-4da9-bfd9-e8ee0bb44cc4/manager/0.log" Feb 17 09:59:30 crc kubenswrapper[4848]: I0217 09:59:30.923132 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-ckxfq_3922bb1d-9f36-4ffc-b382-a54c1c213008/manager/0.log" Feb 17 09:59:31 crc kubenswrapper[4848]: I0217 09:59:31.105170 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-hhb2h_a150a634-4cfd-4d77-ada7-5ab1f65a8985/operator/0.log" Feb 17 09:59:31 crc kubenswrapper[4848]: I0217 09:59:31.388618 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-mltd5_6b8d9b10-d577-4621-88d4-6f26e692a502/manager/0.log" Feb 17 09:59:31 crc kubenswrapper[4848]: I0217 09:59:31.519390 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-jcd2x_bced4dcb-9bee-42a5-9e52-e6ddc83f8f06/manager/0.log" Feb 17 09:59:31 crc kubenswrapper[4848]: I0217 09:59:31.673195 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-4phd8_632a67ba-e5ae-43bb-a69e-49cf64c054e4/manager/0.log" Feb 17 09:59:31 crc kubenswrapper[4848]: I0217 09:59:31.815555 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-7swkp_e72f9717-510f-4f9e-8557-ccd69b4dc61c/manager/0.log" Feb 17 
09:59:31 crc kubenswrapper[4848]: I0217 09:59:31.946462 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-74d597bfd6-4ptpp_32b36aa1-7151-443d-9091-bc1e8ea86805/manager/0.log" Feb 17 09:59:33 crc kubenswrapper[4848]: I0217 09:59:33.289615 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-vqb7z_cf85b89f-2556-4ee7-a12b-6a4379f962e9/manager/0.log" Feb 17 09:59:40 crc kubenswrapper[4848]: I0217 09:59:40.384666 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:59:40 crc kubenswrapper[4848]: E0217 09:59:40.385987 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:59:51 crc kubenswrapper[4848]: I0217 09:59:51.383826 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 09:59:51 crc kubenswrapper[4848]: E0217 09:59:51.385451 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 09:59:51 crc kubenswrapper[4848]: I0217 09:59:51.543794 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4cw7d_f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95/control-plane-machine-set-operator/0.log" Feb 17 09:59:51 crc kubenswrapper[4848]: I0217 09:59:51.738439 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fqhrm_1ebf9d1e-e313-440d-992a-9e0ede5b2b24/kube-rbac-proxy/0.log" Feb 17 09:59:51 crc kubenswrapper[4848]: I0217 09:59:51.785265 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fqhrm_1ebf9d1e-e313-440d-992a-9e0ede5b2b24/machine-api-operator/0.log" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.145425 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx"] Feb 17 10:00:00 crc kubenswrapper[4848]: E0217 10:00:00.146512 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de4ab4ea-ad1c-4c6f-852e-2636fe929069" containerName="container-00" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.146532 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="de4ab4ea-ad1c-4c6f-852e-2636fe929069" containerName="container-00" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.146833 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="de4ab4ea-ad1c-4c6f-852e-2636fe929069" containerName="container-00" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.147580 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.155320 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.155507 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.163566 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx"] Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.345011 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbf5ca97-e3df-4f80-94f3-196d15104768-config-volume\") pod \"collect-profiles-29522040-jfpmx\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.345354 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnm75\" (UniqueName: \"kubernetes.io/projected/dbf5ca97-e3df-4f80-94f3-196d15104768-kube-api-access-vnm75\") pod \"collect-profiles-29522040-jfpmx\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.345449 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbf5ca97-e3df-4f80-94f3-196d15104768-secret-volume\") pod \"collect-profiles-29522040-jfpmx\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.446869 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnm75\" (UniqueName: \"kubernetes.io/projected/dbf5ca97-e3df-4f80-94f3-196d15104768-kube-api-access-vnm75\") pod \"collect-profiles-29522040-jfpmx\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.446938 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbf5ca97-e3df-4f80-94f3-196d15104768-secret-volume\") pod \"collect-profiles-29522040-jfpmx\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.447071 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbf5ca97-e3df-4f80-94f3-196d15104768-config-volume\") pod \"collect-profiles-29522040-jfpmx\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.447875 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbf5ca97-e3df-4f80-94f3-196d15104768-config-volume\") pod \"collect-profiles-29522040-jfpmx\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.453339 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/dbf5ca97-e3df-4f80-94f3-196d15104768-secret-volume\") pod \"collect-profiles-29522040-jfpmx\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.470349 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnm75\" (UniqueName: \"kubernetes.io/projected/dbf5ca97-e3df-4f80-94f3-196d15104768-kube-api-access-vnm75\") pod \"collect-profiles-29522040-jfpmx\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.477886 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:00 crc kubenswrapper[4848]: I0217 10:00:00.958261 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx"] Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.401835 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jvvs9"] Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.404036 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.414509 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvvs9"] Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.566703 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxbk7\" (UniqueName: \"kubernetes.io/projected/dadd5991-7ee3-4b01-8993-faa1b986013c-kube-api-access-qxbk7\") pod \"redhat-operators-jvvs9\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.566859 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-utilities\") pod \"redhat-operators-jvvs9\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.566968 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-catalog-content\") pod \"redhat-operators-jvvs9\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.668513 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-catalog-content\") pod \"redhat-operators-jvvs9\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.668695 4848 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qxbk7\" (UniqueName: \"kubernetes.io/projected/dadd5991-7ee3-4b01-8993-faa1b986013c-kube-api-access-qxbk7\") pod \"redhat-operators-jvvs9\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.668763 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-utilities\") pod \"redhat-operators-jvvs9\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.669146 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-catalog-content\") pod \"redhat-operators-jvvs9\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.669211 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-utilities\") pod \"redhat-operators-jvvs9\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.672096 4848 generic.go:334] "Generic (PLEG): container finished" podID="dbf5ca97-e3df-4f80-94f3-196d15104768" containerID="564e84b4ff22716585aa2485f974343fd96dbccc5db6a2f1e07ff5a98de71149" exitCode=0 Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.672157 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" 
event={"ID":"dbf5ca97-e3df-4f80-94f3-196d15104768","Type":"ContainerDied","Data":"564e84b4ff22716585aa2485f974343fd96dbccc5db6a2f1e07ff5a98de71149"} Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.672193 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" event={"ID":"dbf5ca97-e3df-4f80-94f3-196d15104768","Type":"ContainerStarted","Data":"3526a80ad06e4a05ebf6fb217d7ba12e57892a4e5b6c48b5ebb364fb7ec84ee2"} Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.687511 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxbk7\" (UniqueName: \"kubernetes.io/projected/dadd5991-7ee3-4b01-8993-faa1b986013c-kube-api-access-qxbk7\") pod \"redhat-operators-jvvs9\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:01 crc kubenswrapper[4848]: I0217 10:00:01.746706 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:02 crc kubenswrapper[4848]: I0217 10:00:02.220496 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jvvs9"] Feb 17 10:00:02 crc kubenswrapper[4848]: I0217 10:00:02.383925 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 10:00:02 crc kubenswrapper[4848]: E0217 10:00:02.384212 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:00:02 crc kubenswrapper[4848]: I0217 10:00:02.681869 4848 generic.go:334] "Generic (PLEG): container finished" podID="dadd5991-7ee3-4b01-8993-faa1b986013c" containerID="f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da" exitCode=0 Feb 17 10:00:02 crc kubenswrapper[4848]: I0217 10:00:02.682103 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvs9" event={"ID":"dadd5991-7ee3-4b01-8993-faa1b986013c","Type":"ContainerDied","Data":"f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da"} Feb 17 10:00:02 crc kubenswrapper[4848]: I0217 10:00:02.682166 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvs9" event={"ID":"dadd5991-7ee3-4b01-8993-faa1b986013c","Type":"ContainerStarted","Data":"0f95c152e7ef5a4e1cdd557651466e21488a27c5b6fe18267373bc379b1fedb4"} Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.008644 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.097011 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnm75\" (UniqueName: \"kubernetes.io/projected/dbf5ca97-e3df-4f80-94f3-196d15104768-kube-api-access-vnm75\") pod \"dbf5ca97-e3df-4f80-94f3-196d15104768\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.097136 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbf5ca97-e3df-4f80-94f3-196d15104768-config-volume\") pod \"dbf5ca97-e3df-4f80-94f3-196d15104768\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.097234 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbf5ca97-e3df-4f80-94f3-196d15104768-secret-volume\") pod \"dbf5ca97-e3df-4f80-94f3-196d15104768\" (UID: \"dbf5ca97-e3df-4f80-94f3-196d15104768\") " Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.098815 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbf5ca97-e3df-4f80-94f3-196d15104768-config-volume" (OuterVolumeSpecName: "config-volume") pod "dbf5ca97-e3df-4f80-94f3-196d15104768" (UID: "dbf5ca97-e3df-4f80-94f3-196d15104768"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.103701 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbf5ca97-e3df-4f80-94f3-196d15104768-kube-api-access-vnm75" (OuterVolumeSpecName: "kube-api-access-vnm75") pod "dbf5ca97-e3df-4f80-94f3-196d15104768" (UID: "dbf5ca97-e3df-4f80-94f3-196d15104768"). 
InnerVolumeSpecName "kube-api-access-vnm75". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.105241 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbf5ca97-e3df-4f80-94f3-196d15104768-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dbf5ca97-e3df-4f80-94f3-196d15104768" (UID: "dbf5ca97-e3df-4f80-94f3-196d15104768"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.198917 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnm75\" (UniqueName: \"kubernetes.io/projected/dbf5ca97-e3df-4f80-94f3-196d15104768-kube-api-access-vnm75\") on node \"crc\" DevicePath \"\"" Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.199219 4848 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dbf5ca97-e3df-4f80-94f3-196d15104768-config-volume\") on node \"crc\" DevicePath \"\"" Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.199228 4848 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dbf5ca97-e3df-4f80-94f3-196d15104768-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.691543 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" event={"ID":"dbf5ca97-e3df-4f80-94f3-196d15104768","Type":"ContainerDied","Data":"3526a80ad06e4a05ebf6fb217d7ba12e57892a4e5b6c48b5ebb364fb7ec84ee2"} Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.691586 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3526a80ad06e4a05ebf6fb217d7ba12e57892a4e5b6c48b5ebb364fb7ec84ee2" Feb 17 10:00:03 crc kubenswrapper[4848]: I0217 10:00:03.691598 4848 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522040-jfpmx" Feb 17 10:00:04 crc kubenswrapper[4848]: I0217 10:00:04.095950 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88"] Feb 17 10:00:04 crc kubenswrapper[4848]: I0217 10:00:04.106540 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29521995-jzx88"] Feb 17 10:00:05 crc kubenswrapper[4848]: I0217 10:00:05.396926 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98bce666-a802-42f1-9b9e-366d88c049ba" path="/var/lib/kubelet/pods/98bce666-a802-42f1-9b9e-366d88c049ba/volumes" Feb 17 10:00:05 crc kubenswrapper[4848]: I0217 10:00:05.426357 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-whbhm_08f85cef-d7cc-46c2-a1ff-ba22a9b098ab/cert-manager-controller/0.log" Feb 17 10:00:05 crc kubenswrapper[4848]: I0217 10:00:05.540348 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-8pgtk_4851dc21-51d9-4c87-a15c-4b7295155016/cert-manager-cainjector/0.log" Feb 17 10:00:05 crc kubenswrapper[4848]: I0217 10:00:05.620663 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-crrm4_f0048a92-b3fb-4c29-b58f-7013b68e1512/cert-manager-webhook/0.log" Feb 17 10:00:05 crc kubenswrapper[4848]: I0217 10:00:05.710176 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvs9" event={"ID":"dadd5991-7ee3-4b01-8993-faa1b986013c","Type":"ContainerStarted","Data":"a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26"} Feb 17 10:00:12 crc kubenswrapper[4848]: I0217 10:00:12.784652 4848 generic.go:334] "Generic (PLEG): container finished" podID="dadd5991-7ee3-4b01-8993-faa1b986013c" 
containerID="a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26" exitCode=0 Feb 17 10:00:12 crc kubenswrapper[4848]: I0217 10:00:12.784719 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvs9" event={"ID":"dadd5991-7ee3-4b01-8993-faa1b986013c","Type":"ContainerDied","Data":"a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26"} Feb 17 10:00:13 crc kubenswrapper[4848]: I0217 10:00:13.796570 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvs9" event={"ID":"dadd5991-7ee3-4b01-8993-faa1b986013c","Type":"ContainerStarted","Data":"1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe"} Feb 17 10:00:13 crc kubenswrapper[4848]: I0217 10:00:13.822301 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jvvs9" podStartSLOduration=2.131568443 podStartE2EDuration="12.822284997s" podCreationTimestamp="2026-02-17 10:00:01 +0000 UTC" firstStartedPulling="2026-02-17 10:00:02.683754573 +0000 UTC m=+3280.227010219" lastFinishedPulling="2026-02-17 10:00:13.374471127 +0000 UTC m=+3290.917726773" observedRunningTime="2026-02-17 10:00:13.818124759 +0000 UTC m=+3291.361380425" watchObservedRunningTime="2026-02-17 10:00:13.822284997 +0000 UTC m=+3291.365540643" Feb 17 10:00:16 crc kubenswrapper[4848]: I0217 10:00:16.383878 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 10:00:16 crc kubenswrapper[4848]: E0217 10:00:16.384615 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" 
podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:00:18 crc kubenswrapper[4848]: I0217 10:00:18.690566 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-x4n7r_413a0360-d8d4-427d-adbc-3d7914e54ea5/nmstate-console-plugin/0.log" Feb 17 10:00:18 crc kubenswrapper[4848]: I0217 10:00:18.904651 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6fjkq_473852b6-e35d-4b9f-8b47-e55ccb774b93/nmstate-handler/0.log" Feb 17 10:00:18 crc kubenswrapper[4848]: I0217 10:00:18.976019 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-7txp2_989f6a1e-38ab-40a8-94aa-faadc620efca/kube-rbac-proxy/0.log" Feb 17 10:00:19 crc kubenswrapper[4848]: I0217 10:00:19.076129 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-7txp2_989f6a1e-38ab-40a8-94aa-faadc620efca/nmstate-metrics/0.log" Feb 17 10:00:19 crc kubenswrapper[4848]: I0217 10:00:19.121499 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-swpl6_c406a2fd-a4a9-47fb-bfff-80324dae94c4/nmstate-operator/0.log" Feb 17 10:00:19 crc kubenswrapper[4848]: I0217 10:00:19.285302 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-pl898_08ae32ff-43fc-4536-b68a-45e4fd947a2d/nmstate-webhook/0.log" Feb 17 10:00:21 crc kubenswrapper[4848]: I0217 10:00:21.747281 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:21 crc kubenswrapper[4848]: I0217 10:00:21.747537 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:21 crc kubenswrapper[4848]: I0217 10:00:21.792859 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:21 crc kubenswrapper[4848]: I0217 10:00:21.952697 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:22 crc kubenswrapper[4848]: I0217 10:00:22.037652 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvvs9"] Feb 17 10:00:23 crc kubenswrapper[4848]: I0217 10:00:23.910744 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jvvs9" podUID="dadd5991-7ee3-4b01-8993-faa1b986013c" containerName="registry-server" containerID="cri-o://1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe" gracePeriod=2 Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.384385 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.446039 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-catalog-content\") pod \"dadd5991-7ee3-4b01-8993-faa1b986013c\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.446117 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxbk7\" (UniqueName: \"kubernetes.io/projected/dadd5991-7ee3-4b01-8993-faa1b986013c-kube-api-access-qxbk7\") pod \"dadd5991-7ee3-4b01-8993-faa1b986013c\" (UID: \"dadd5991-7ee3-4b01-8993-faa1b986013c\") " Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.446232 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-utilities\") pod \"dadd5991-7ee3-4b01-8993-faa1b986013c\" (UID: 
\"dadd5991-7ee3-4b01-8993-faa1b986013c\") " Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.447125 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-utilities" (OuterVolumeSpecName: "utilities") pod "dadd5991-7ee3-4b01-8993-faa1b986013c" (UID: "dadd5991-7ee3-4b01-8993-faa1b986013c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.472251 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadd5991-7ee3-4b01-8993-faa1b986013c-kube-api-access-qxbk7" (OuterVolumeSpecName: "kube-api-access-qxbk7") pod "dadd5991-7ee3-4b01-8993-faa1b986013c" (UID: "dadd5991-7ee3-4b01-8993-faa1b986013c"). InnerVolumeSpecName "kube-api-access-qxbk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.548391 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxbk7\" (UniqueName: \"kubernetes.io/projected/dadd5991-7ee3-4b01-8993-faa1b986013c-kube-api-access-qxbk7\") on node \"crc\" DevicePath \"\"" Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.548606 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.563552 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dadd5991-7ee3-4b01-8993-faa1b986013c" (UID: "dadd5991-7ee3-4b01-8993-faa1b986013c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.650332 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dadd5991-7ee3-4b01-8993-faa1b986013c-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.930488 4848 generic.go:334] "Generic (PLEG): container finished" podID="dadd5991-7ee3-4b01-8993-faa1b986013c" containerID="1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe" exitCode=0 Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.930533 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvs9" event={"ID":"dadd5991-7ee3-4b01-8993-faa1b986013c","Type":"ContainerDied","Data":"1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe"} Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.930557 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jvvs9" Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.930580 4848 scope.go:117] "RemoveContainer" containerID="1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe" Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.930564 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jvvs9" event={"ID":"dadd5991-7ee3-4b01-8993-faa1b986013c","Type":"ContainerDied","Data":"0f95c152e7ef5a4e1cdd557651466e21488a27c5b6fe18267373bc379b1fedb4"} Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.958668 4848 scope.go:117] "RemoveContainer" containerID="a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26" Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.976365 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jvvs9"] Feb 17 10:00:24 crc kubenswrapper[4848]: I0217 10:00:24.986337 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jvvs9"] Feb 17 10:00:25 crc kubenswrapper[4848]: I0217 10:00:25.002156 4848 scope.go:117] "RemoveContainer" containerID="f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da" Feb 17 10:00:25 crc kubenswrapper[4848]: I0217 10:00:25.048959 4848 scope.go:117] "RemoveContainer" containerID="1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe" Feb 17 10:00:25 crc kubenswrapper[4848]: E0217 10:00:25.050158 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe\": container with ID starting with 1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe not found: ID does not exist" containerID="1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe" Feb 17 10:00:25 crc kubenswrapper[4848]: I0217 10:00:25.050187 4848 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe"} err="failed to get container status \"1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe\": rpc error: code = NotFound desc = could not find container \"1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe\": container with ID starting with 1fd4317f48b265d329980660f9b30ca5934a8555f20674688f070f7a2d524cfe not found: ID does not exist" Feb 17 10:00:25 crc kubenswrapper[4848]: I0217 10:00:25.050210 4848 scope.go:117] "RemoveContainer" containerID="a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26" Feb 17 10:00:25 crc kubenswrapper[4848]: E0217 10:00:25.053997 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26\": container with ID starting with a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26 not found: ID does not exist" containerID="a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26" Feb 17 10:00:25 crc kubenswrapper[4848]: I0217 10:00:25.054023 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26"} err="failed to get container status \"a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26\": rpc error: code = NotFound desc = could not find container \"a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26\": container with ID starting with a41a402e2f6cef4dac8a9dadfcf6ea04b901c15ac5ce8635cfe0b7457077dc26 not found: ID does not exist" Feb 17 10:00:25 crc kubenswrapper[4848]: I0217 10:00:25.054043 4848 scope.go:117] "RemoveContainer" containerID="f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da" Feb 17 10:00:25 crc kubenswrapper[4848]: E0217 
10:00:25.054528 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da\": container with ID starting with f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da not found: ID does not exist" containerID="f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da" Feb 17 10:00:25 crc kubenswrapper[4848]: I0217 10:00:25.054546 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da"} err="failed to get container status \"f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da\": rpc error: code = NotFound desc = could not find container \"f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da\": container with ID starting with f676d67f635d987fcbc53a28128cb5af727f4317333949615d8db290248106da not found: ID does not exist" Feb 17 10:00:25 crc kubenswrapper[4848]: I0217 10:00:25.397286 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadd5991-7ee3-4b01-8993-faa1b986013c" path="/var/lib/kubelet/pods/dadd5991-7ee3-4b01-8993-faa1b986013c/volumes" Feb 17 10:00:29 crc kubenswrapper[4848]: I0217 10:00:29.143068 4848 scope.go:117] "RemoveContainer" containerID="a5e852f666ccc30caf2695740c63d0c07c0a4db89fe690eca1fe310e104f6011" Feb 17 10:00:30 crc kubenswrapper[4848]: I0217 10:00:30.383680 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 10:00:30 crc kubenswrapper[4848]: E0217 10:00:30.384016 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:00:44 crc kubenswrapper[4848]: I0217 10:00:44.384817 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 10:00:44 crc kubenswrapper[4848]: E0217 10:00:44.385563 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:00:46 crc kubenswrapper[4848]: I0217 10:00:46.639321 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7cbd2_8ff272d7-4c99-464c-819d-b7b22fc8be06/kube-rbac-proxy/0.log" Feb 17 10:00:46 crc kubenswrapper[4848]: I0217 10:00:46.820861 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7cbd2_8ff272d7-4c99-464c-819d-b7b22fc8be06/controller/0.log" Feb 17 10:00:46 crc kubenswrapper[4848]: I0217 10:00:46.892982 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-frr-files/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.100654 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-frr-files/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.119619 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-reloader/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.126652 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-metrics/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.136379 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-reloader/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.285886 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-frr-files/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.297746 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-reloader/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.307042 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-metrics/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.334450 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-metrics/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.543441 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-metrics/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.570659 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-reloader/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.571457 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-frr-files/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.582586 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/controller/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.775472 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/kube-rbac-proxy/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.786873 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/frr-metrics/0.log" Feb 17 10:00:47 crc kubenswrapper[4848]: I0217 10:00:47.787286 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/kube-rbac-proxy-frr/0.log" Feb 17 10:00:48 crc kubenswrapper[4848]: I0217 10:00:48.016171 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-5l7zm_c229235f-b879-43bc-9b19-b4196264d1ec/frr-k8s-webhook-server/0.log" Feb 17 10:00:48 crc kubenswrapper[4848]: I0217 10:00:48.019509 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/reloader/0.log" Feb 17 10:00:48 crc kubenswrapper[4848]: I0217 10:00:48.344899 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6df4786bd-895gn_054a38ba-b80d-44df-b84a-e5e3b9847df3/manager/0.log" Feb 17 10:00:48 crc kubenswrapper[4848]: I0217 10:00:48.465976 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7f987958c8-pm7pp_ccc28183-efb5-4673-8268-44ed1ced4cb7/webhook-server/0.log" Feb 17 10:00:48 crc kubenswrapper[4848]: I0217 10:00:48.555357 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lmz9q_36953889-0f59-4c5e-a666-c80389e18bf8/kube-rbac-proxy/0.log" Feb 17 10:00:49 crc kubenswrapper[4848]: I0217 10:00:49.017809 4848 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/frr/0.log" Feb 17 10:00:49 crc kubenswrapper[4848]: I0217 10:00:49.187269 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lmz9q_36953889-0f59-4c5e-a666-c80389e18bf8/speaker/0.log" Feb 17 10:00:59 crc kubenswrapper[4848]: I0217 10:00:59.383518 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 10:00:59 crc kubenswrapper[4848]: E0217 10:00:59.384370 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.161338 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29522041-h9qqk"] Feb 17 10:01:00 crc kubenswrapper[4848]: E0217 10:01:00.162076 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadd5991-7ee3-4b01-8993-faa1b986013c" containerName="registry-server" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.162102 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadd5991-7ee3-4b01-8993-faa1b986013c" containerName="registry-server" Feb 17 10:01:00 crc kubenswrapper[4848]: E0217 10:01:00.162118 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadd5991-7ee3-4b01-8993-faa1b986013c" containerName="extract-utilities" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.162126 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadd5991-7ee3-4b01-8993-faa1b986013c" containerName="extract-utilities" Feb 17 10:01:00 crc 
kubenswrapper[4848]: E0217 10:01:00.162138 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbf5ca97-e3df-4f80-94f3-196d15104768" containerName="collect-profiles" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.162146 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbf5ca97-e3df-4f80-94f3-196d15104768" containerName="collect-profiles" Feb 17 10:01:00 crc kubenswrapper[4848]: E0217 10:01:00.162168 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dadd5991-7ee3-4b01-8993-faa1b986013c" containerName="extract-content" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.162175 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadd5991-7ee3-4b01-8993-faa1b986013c" containerName="extract-content" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.162427 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dadd5991-7ee3-4b01-8993-faa1b986013c" containerName="registry-server" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.162458 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbf5ca97-e3df-4f80-94f3-196d15104768" containerName="collect-profiles" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.163713 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.179086 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29522041-h9qqk"] Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.282127 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-fernet-keys\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.282183 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-config-data\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.282207 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnzpk\" (UniqueName: \"kubernetes.io/projected/5bcdc424-dcde-46e9-ac56-fe1b5330844c-kube-api-access-qnzpk\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.282321 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-combined-ca-bundle\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.384262 4848 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-fernet-keys\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.385076 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-config-data\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.385198 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnzpk\" (UniqueName: \"kubernetes.io/projected/5bcdc424-dcde-46e9-ac56-fe1b5330844c-kube-api-access-qnzpk\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.385337 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-combined-ca-bundle\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.390414 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-fernet-keys\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.390502 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-combined-ca-bundle\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.390890 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-config-data\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.411692 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnzpk\" (UniqueName: \"kubernetes.io/projected/5bcdc424-dcde-46e9-ac56-fe1b5330844c-kube-api-access-qnzpk\") pod \"keystone-cron-29522041-h9qqk\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") " pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.492998 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29522041-h9qqk" Feb 17 10:01:00 crc kubenswrapper[4848]: I0217 10:01:00.982316 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29522041-h9qqk"] Feb 17 10:01:01 crc kubenswrapper[4848]: I0217 10:01:01.241357 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522041-h9qqk" event={"ID":"5bcdc424-dcde-46e9-ac56-fe1b5330844c","Type":"ContainerStarted","Data":"52548f8f74876035b16d9ababac16b4021defb3a682eaa9be884286f2c229160"} Feb 17 10:01:01 crc kubenswrapper[4848]: I0217 10:01:01.241400 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522041-h9qqk" event={"ID":"5bcdc424-dcde-46e9-ac56-fe1b5330844c","Type":"ContainerStarted","Data":"60c5b2d1a5f16baf06b7dc12e3c625ecf583a59ba3743790bec4e4ad1941d53a"} Feb 17 10:01:01 crc kubenswrapper[4848]: I0217 10:01:01.265676 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29522041-h9qqk" podStartSLOduration=1.265654554 podStartE2EDuration="1.265654554s" podCreationTimestamp="2026-02-17 10:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 10:01:01.256638068 +0000 UTC m=+3338.799893724" watchObservedRunningTime="2026-02-17 10:01:01.265654554 +0000 UTC m=+3338.808910200" Feb 17 10:01:01 crc kubenswrapper[4848]: I0217 10:01:01.680659 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/util/0.log" Feb 17 10:01:01 crc kubenswrapper[4848]: I0217 10:01:01.977671 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/pull/0.log" Feb 17 10:01:01 crc kubenswrapper[4848]: 
I0217 10:01:01.992959 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/pull/0.log" Feb 17 10:01:02 crc kubenswrapper[4848]: I0217 10:01:02.001151 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/util/0.log" Feb 17 10:01:02 crc kubenswrapper[4848]: I0217 10:01:02.152448 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/util/0.log" Feb 17 10:01:02 crc kubenswrapper[4848]: I0217 10:01:02.168686 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/extract/0.log" Feb 17 10:01:02 crc kubenswrapper[4848]: I0217 10:01:02.183794 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/pull/0.log" Feb 17 10:01:02 crc kubenswrapper[4848]: I0217 10:01:02.325623 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-utilities/0.log" Feb 17 10:01:02 crc kubenswrapper[4848]: I0217 10:01:02.517189 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-utilities/0.log" Feb 17 10:01:02 crc kubenswrapper[4848]: I0217 10:01:02.520794 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-content/0.log" Feb 17 10:01:02 
crc kubenswrapper[4848]: I0217 10:01:02.522503 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-content/0.log" Feb 17 10:01:02 crc kubenswrapper[4848]: I0217 10:01:02.644700 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-utilities/0.log" Feb 17 10:01:02 crc kubenswrapper[4848]: I0217 10:01:02.706332 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-content/0.log" Feb 17 10:01:02 crc kubenswrapper[4848]: I0217 10:01:02.957655 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-utilities/0.log" Feb 17 10:01:03 crc kubenswrapper[4848]: I0217 10:01:03.132232 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-utilities/0.log" Feb 17 10:01:03 crc kubenswrapper[4848]: I0217 10:01:03.133726 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/registry-server/0.log" Feb 17 10:01:03 crc kubenswrapper[4848]: I0217 10:01:03.182752 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-content/0.log" Feb 17 10:01:03 crc kubenswrapper[4848]: I0217 10:01:03.215961 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-content/0.log" Feb 17 10:01:03 crc kubenswrapper[4848]: I0217 10:01:03.497298 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-content/0.log"
Feb 17 10:01:03 crc kubenswrapper[4848]: I0217 10:01:03.618060 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-utilities/0.log"
Feb 17 10:01:03 crc kubenswrapper[4848]: I0217 10:01:03.735215 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/util/0.log"
Feb 17 10:01:03 crc kubenswrapper[4848]: I0217 10:01:03.961353 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/util/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.002633 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/pull/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.011022 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/pull/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.094050 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/registry-server/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.181138 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/util/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.206012 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/pull/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.240345 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/extract/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.265519 4848 generic.go:334] "Generic (PLEG): container finished" podID="5bcdc424-dcde-46e9-ac56-fe1b5330844c" containerID="52548f8f74876035b16d9ababac16b4021defb3a682eaa9be884286f2c229160" exitCode=0
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.265573 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522041-h9qqk" event={"ID":"5bcdc424-dcde-46e9-ac56-fe1b5330844c","Type":"ContainerDied","Data":"52548f8f74876035b16d9ababac16b4021defb3a682eaa9be884286f2c229160"}
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.411070 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xvh8l_cd7df0ba-ff8c-48ce-ad07-8ac50003f318/marketplace-operator/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.479999 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-utilities/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.650612 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-utilities/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.698367 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-content/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.713484 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-content/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.860095 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-content/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.900033 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-utilities/0.log"
Feb 17 10:01:04 crc kubenswrapper[4848]: I0217 10:01:04.953093 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/registry-server/0.log"
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.065321 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-utilities/0.log"
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.197976 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-utilities/0.log"
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.246042 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-content/0.log"
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.278135 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-content/0.log"
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.525209 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-content/0.log"
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.542319 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-utilities/0.log"
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.670831 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29522041-h9qqk"
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.792175 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-config-data\") pod \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") "
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.792230 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnzpk\" (UniqueName: \"kubernetes.io/projected/5bcdc424-dcde-46e9-ac56-fe1b5330844c-kube-api-access-qnzpk\") pod \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") "
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.792273 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-combined-ca-bundle\") pod \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") "
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.792435 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-fernet-keys\") pod \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\" (UID: \"5bcdc424-dcde-46e9-ac56-fe1b5330844c\") "
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.804041 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bcdc424-dcde-46e9-ac56-fe1b5330844c-kube-api-access-qnzpk" (OuterVolumeSpecName: "kube-api-access-qnzpk") pod "5bcdc424-dcde-46e9-ac56-fe1b5330844c" (UID: "5bcdc424-dcde-46e9-ac56-fe1b5330844c"). InnerVolumeSpecName "kube-api-access-qnzpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.815965 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5bcdc424-dcde-46e9-ac56-fe1b5330844c" (UID: "5bcdc424-dcde-46e9-ac56-fe1b5330844c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.835701 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bcdc424-dcde-46e9-ac56-fe1b5330844c" (UID: "5bcdc424-dcde-46e9-ac56-fe1b5330844c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.888717 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-config-data" (OuterVolumeSpecName: "config-data") pod "5bcdc424-dcde-46e9-ac56-fe1b5330844c" (UID: "5bcdc424-dcde-46e9-ac56-fe1b5330844c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.895316 4848 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-config-data\") on node \"crc\" DevicePath \"\""
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.895347 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnzpk\" (UniqueName: \"kubernetes.io/projected/5bcdc424-dcde-46e9-ac56-fe1b5330844c-kube-api-access-qnzpk\") on node \"crc\" DevicePath \"\""
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.895356 4848 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.895366 4848 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5bcdc424-dcde-46e9-ac56-fe1b5330844c-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 17 10:01:05 crc kubenswrapper[4848]: I0217 10:01:05.992029 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/registry-server/0.log"
Feb 17 10:01:06 crc kubenswrapper[4848]: I0217 10:01:06.284878 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29522041-h9qqk" event={"ID":"5bcdc424-dcde-46e9-ac56-fe1b5330844c","Type":"ContainerDied","Data":"60c5b2d1a5f16baf06b7dc12e3c625ecf583a59ba3743790bec4e4ad1941d53a"}
Feb 17 10:01:06 crc kubenswrapper[4848]: I0217 10:01:06.285233 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60c5b2d1a5f16baf06b7dc12e3c625ecf583a59ba3743790bec4e4ad1941d53a"
Feb 17 10:01:06 crc kubenswrapper[4848]: I0217 10:01:06.284962 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29522041-h9qqk"
Feb 17 10:01:10 crc kubenswrapper[4848]: I0217 10:01:10.383992 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5"
Feb 17 10:01:10 crc kubenswrapper[4848]: E0217 10:01:10.384716 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:01:23 crc kubenswrapper[4848]: I0217 10:01:23.416170 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5"
Feb 17 10:01:23 crc kubenswrapper[4848]: E0217 10:01:23.418116 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:01:35 crc kubenswrapper[4848]: I0217 10:01:35.384092 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5"
Feb 17 10:01:35 crc kubenswrapper[4848]: E0217 10:01:35.385269 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:01:49 crc kubenswrapper[4848]: I0217 10:01:49.386492 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5"
Feb 17 10:01:49 crc kubenswrapper[4848]: E0217 10:01:49.389565 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:02:02 crc kubenswrapper[4848]: I0217 10:02:02.383857 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5"
Feb 17 10:02:02 crc kubenswrapper[4848]: E0217 10:02:02.384635 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:02:13 crc kubenswrapper[4848]: I0217 10:02:13.390122 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5"
Feb 17 10:02:13 crc kubenswrapper[4848]: E0217 10:02:13.399571 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:02:24 crc kubenswrapper[4848]: I0217 10:02:24.385119 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5"
Feb 17 10:02:25 crc kubenswrapper[4848]: I0217 10:02:25.099023 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"bf1fbd7c833a9345e13be53113b338550eebd27fd5bf7c6bcfec90cfb0ca3555"}
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.595536 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xlzjz"]
Feb 17 10:02:35 crc kubenswrapper[4848]: E0217 10:02:35.597272 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcdc424-dcde-46e9-ac56-fe1b5330844c" containerName="keystone-cron"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.597293 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcdc424-dcde-46e9-ac56-fe1b5330844c" containerName="keystone-cron"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.597537 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bcdc424-dcde-46e9-ac56-fe1b5330844c" containerName="keystone-cron"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.600519 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.605704 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xlzjz"]
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.706306 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8szh\" (UniqueName: \"kubernetes.io/projected/69a682b8-88ce-47ea-b82c-77ceb490cccf-kube-api-access-n8szh\") pod \"community-operators-xlzjz\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") " pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.706987 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-utilities\") pod \"community-operators-xlzjz\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") " pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.707038 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-catalog-content\") pod \"community-operators-xlzjz\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") " pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.809179 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-utilities\") pod \"community-operators-xlzjz\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") " pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.809280 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-catalog-content\") pod \"community-operators-xlzjz\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") " pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.809368 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8szh\" (UniqueName: \"kubernetes.io/projected/69a682b8-88ce-47ea-b82c-77ceb490cccf-kube-api-access-n8szh\") pod \"community-operators-xlzjz\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") " pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.809686 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-utilities\") pod \"community-operators-xlzjz\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") " pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.809708 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-catalog-content\") pod \"community-operators-xlzjz\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") " pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.832160 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8szh\" (UniqueName: \"kubernetes.io/projected/69a682b8-88ce-47ea-b82c-77ceb490cccf-kube-api-access-n8szh\") pod \"community-operators-xlzjz\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") " pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:35 crc kubenswrapper[4848]: I0217 10:02:35.935936 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:36 crc kubenswrapper[4848]: I0217 10:02:36.565489 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xlzjz"]
Feb 17 10:02:37 crc kubenswrapper[4848]: I0217 10:02:37.215617 4848 generic.go:334] "Generic (PLEG): container finished" podID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerID="9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52" exitCode=0
Feb 17 10:02:37 crc kubenswrapper[4848]: I0217 10:02:37.215703 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlzjz" event={"ID":"69a682b8-88ce-47ea-b82c-77ceb490cccf","Type":"ContainerDied","Data":"9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52"}
Feb 17 10:02:37 crc kubenswrapper[4848]: I0217 10:02:37.215960 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlzjz" event={"ID":"69a682b8-88ce-47ea-b82c-77ceb490cccf","Type":"ContainerStarted","Data":"dcffa0949177f30e8b3a0b3f04930ec3e5046231ae25a5ece71f6fee07689cf6"}
Feb 17 10:02:37 crc kubenswrapper[4848]: I0217 10:02:37.218277 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 17 10:02:38 crc kubenswrapper[4848]: I0217 10:02:38.230039 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlzjz" event={"ID":"69a682b8-88ce-47ea-b82c-77ceb490cccf","Type":"ContainerStarted","Data":"b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7"}
Feb 17 10:02:41 crc kubenswrapper[4848]: I0217 10:02:41.266791 4848 generic.go:334] "Generic (PLEG): container finished" podID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerID="b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7" exitCode=0
Feb 17 10:02:41 crc kubenswrapper[4848]: I0217 10:02:41.267000 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlzjz" event={"ID":"69a682b8-88ce-47ea-b82c-77ceb490cccf","Type":"ContainerDied","Data":"b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7"}
Feb 17 10:02:42 crc kubenswrapper[4848]: I0217 10:02:42.277607 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlzjz" event={"ID":"69a682b8-88ce-47ea-b82c-77ceb490cccf","Type":"ContainerStarted","Data":"afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088"}
Feb 17 10:02:42 crc kubenswrapper[4848]: I0217 10:02:42.308278 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xlzjz" podStartSLOduration=2.7924076270000002 podStartE2EDuration="7.308262868s" podCreationTimestamp="2026-02-17 10:02:35 +0000 UTC" firstStartedPulling="2026-02-17 10:02:37.217784796 +0000 UTC m=+3434.761040512" lastFinishedPulling="2026-02-17 10:02:41.733640087 +0000 UTC m=+3439.276895753" observedRunningTime="2026-02-17 10:02:42.30410373 +0000 UTC m=+3439.847359396" watchObservedRunningTime="2026-02-17 10:02:42.308262868 +0000 UTC m=+3439.851518514"
Feb 17 10:02:45 crc kubenswrapper[4848]: I0217 10:02:45.936952 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:45 crc kubenswrapper[4848]: I0217 10:02:45.937519 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:45 crc kubenswrapper[4848]: I0217 10:02:45.997525 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:46 crc kubenswrapper[4848]: I0217 10:02:46.433748 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:46 crc kubenswrapper[4848]: I0217 10:02:46.503754 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xlzjz"]
Feb 17 10:02:48 crc kubenswrapper[4848]: I0217 10:02:48.379243 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xlzjz" podUID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerName="registry-server" containerID="cri-o://afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088" gracePeriod=2
Feb 17 10:02:48 crc kubenswrapper[4848]: I0217 10:02:48.833086 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:48 crc kubenswrapper[4848]: I0217 10:02:48.912866 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-catalog-content\") pod \"69a682b8-88ce-47ea-b82c-77ceb490cccf\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") "
Feb 17 10:02:48 crc kubenswrapper[4848]: I0217 10:02:48.912976 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-utilities\") pod \"69a682b8-88ce-47ea-b82c-77ceb490cccf\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") "
Feb 17 10:02:48 crc kubenswrapper[4848]: I0217 10:02:48.913018 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8szh\" (UniqueName: \"kubernetes.io/projected/69a682b8-88ce-47ea-b82c-77ceb490cccf-kube-api-access-n8szh\") pod \"69a682b8-88ce-47ea-b82c-77ceb490cccf\" (UID: \"69a682b8-88ce-47ea-b82c-77ceb490cccf\") "
Feb 17 10:02:48 crc kubenswrapper[4848]: I0217 10:02:48.914575 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-utilities" (OuterVolumeSpecName: "utilities") pod "69a682b8-88ce-47ea-b82c-77ceb490cccf" (UID: "69a682b8-88ce-47ea-b82c-77ceb490cccf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 10:02:48 crc kubenswrapper[4848]: I0217 10:02:48.919525 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69a682b8-88ce-47ea-b82c-77ceb490cccf-kube-api-access-n8szh" (OuterVolumeSpecName: "kube-api-access-n8szh") pod "69a682b8-88ce-47ea-b82c-77ceb490cccf" (UID: "69a682b8-88ce-47ea-b82c-77ceb490cccf"). InnerVolumeSpecName "kube-api-access-n8szh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 10:02:48 crc kubenswrapper[4848]: I0217 10:02:48.994241 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69a682b8-88ce-47ea-b82c-77ceb490cccf" (UID: "69a682b8-88ce-47ea-b82c-77ceb490cccf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.015640 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.015683 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69a682b8-88ce-47ea-b82c-77ceb490cccf-utilities\") on node \"crc\" DevicePath \"\""
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.015696 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8szh\" (UniqueName: \"kubernetes.io/projected/69a682b8-88ce-47ea-b82c-77ceb490cccf-kube-api-access-n8szh\") on node \"crc\" DevicePath \"\""
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.392961 4848 generic.go:334] "Generic (PLEG): container finished" podID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerID="afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088" exitCode=0
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.393056 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlzjz"
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.397116 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlzjz" event={"ID":"69a682b8-88ce-47ea-b82c-77ceb490cccf","Type":"ContainerDied","Data":"afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088"}
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.397443 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlzjz" event={"ID":"69a682b8-88ce-47ea-b82c-77ceb490cccf","Type":"ContainerDied","Data":"dcffa0949177f30e8b3a0b3f04930ec3e5046231ae25a5ece71f6fee07689cf6"}
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.397467 4848 scope.go:117] "RemoveContainer" containerID="afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088"
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.419101 4848 scope.go:117] "RemoveContainer" containerID="b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7"
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.452714 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xlzjz"]
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.453360 4848 scope.go:117] "RemoveContainer" containerID="9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52"
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.465531 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xlzjz"]
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.507451 4848 scope.go:117] "RemoveContainer" containerID="afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088"
Feb 17 10:02:49 crc kubenswrapper[4848]: E0217 10:02:49.507861 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088\": container with ID starting with afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088 not found: ID does not exist" containerID="afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088"
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.507891 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088"} err="failed to get container status \"afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088\": rpc error: code = NotFound desc = could not find container \"afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088\": container with ID starting with afae6ed7e9d0f3a1ddcef1d3fb4fbe14d19a6655ad150c42d0502517c920d088 not found: ID does not exist"
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.507912 4848 scope.go:117] "RemoveContainer" containerID="b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7"
Feb 17 10:02:49 crc kubenswrapper[4848]: E0217 10:02:49.508361 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7\": container with ID starting with b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7 not found: ID does not exist" containerID="b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7"
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.508403 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7"} err="failed to get container status \"b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7\": rpc error: code = NotFound desc = could not find container \"b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7\": container with ID starting with b837da4963c15e764e3b51ed56087edc629c83c100027d7f9228581e9d7fa4e7 not found: ID does not exist"
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.508428 4848 scope.go:117] "RemoveContainer" containerID="9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52"
Feb 17 10:02:49 crc kubenswrapper[4848]: E0217 10:02:49.508748 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52\": container with ID starting with 9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52 not found: ID does not exist" containerID="9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52"
Feb 17 10:02:49 crc kubenswrapper[4848]: I0217 10:02:49.508806 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52"} err="failed to get container status \"9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52\": rpc error: code = NotFound desc = could not find container \"9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52\": container with ID starting with 9643e378fccff2d54e8b56075e077838cafb98cebcd33ea1fad9c71a47c07c52 not found: ID does not exist"
Feb 17 10:02:51 crc kubenswrapper[4848]: I0217 10:02:51.402536 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69a682b8-88ce-47ea-b82c-77ceb490cccf" path="/var/lib/kubelet/pods/69a682b8-88ce-47ea-b82c-77ceb490cccf/volumes"
Feb 17 10:02:53 crc kubenswrapper[4848]: I0217 10:02:53.460929 4848 generic.go:334] "Generic (PLEG): container finished" podID="0be33e07-83c0-4b4d-b24c-3b9c98467671" containerID="d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9" exitCode=0
Feb 17 10:02:53 crc kubenswrapper[4848]: I0217 10:02:53.461056 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zndr7/must-gather-2cxt8" event={"ID":"0be33e07-83c0-4b4d-b24c-3b9c98467671","Type":"ContainerDied","Data":"d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9"}
Feb 17 10:02:53 crc kubenswrapper[4848]: I0217 10:02:53.462048 4848 scope.go:117] "RemoveContainer" containerID="d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9"
Feb 17 10:02:54 crc kubenswrapper[4848]: I0217 10:02:54.150255 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zndr7_must-gather-2cxt8_0be33e07-83c0-4b4d-b24c-3b9c98467671/gather/0.log"
Feb 17 10:03:02 crc kubenswrapper[4848]: I0217 10:03:02.657127 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zndr7/must-gather-2cxt8"]
Feb 17 10:03:02 crc kubenswrapper[4848]: I0217 10:03:02.657960 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zndr7/must-gather-2cxt8" podUID="0be33e07-83c0-4b4d-b24c-3b9c98467671" containerName="copy" containerID="cri-o://2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e" gracePeriod=2
Feb 17 10:03:02 crc kubenswrapper[4848]: I0217 10:03:02.665843 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zndr7/must-gather-2cxt8"]
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.133540 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zndr7_must-gather-2cxt8_0be33e07-83c0-4b4d-b24c-3b9c98467671/copy/0.log"
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.134567 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zndr7/must-gather-2cxt8"
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.290683 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw2k2\" (UniqueName: \"kubernetes.io/projected/0be33e07-83c0-4b4d-b24c-3b9c98467671-kube-api-access-gw2k2\") pod \"0be33e07-83c0-4b4d-b24c-3b9c98467671\" (UID: \"0be33e07-83c0-4b4d-b24c-3b9c98467671\") "
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.290863 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0be33e07-83c0-4b4d-b24c-3b9c98467671-must-gather-output\") pod \"0be33e07-83c0-4b4d-b24c-3b9c98467671\" (UID: \"0be33e07-83c0-4b4d-b24c-3b9c98467671\") "
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.295491 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0be33e07-83c0-4b4d-b24c-3b9c98467671-kube-api-access-gw2k2" (OuterVolumeSpecName: "kube-api-access-gw2k2") pod "0be33e07-83c0-4b4d-b24c-3b9c98467671" (UID: "0be33e07-83c0-4b4d-b24c-3b9c98467671"). InnerVolumeSpecName "kube-api-access-gw2k2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.394545 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw2k2\" (UniqueName: \"kubernetes.io/projected/0be33e07-83c0-4b4d-b24c-3b9c98467671-kube-api-access-gw2k2\") on node \"crc\" DevicePath \"\""
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.469194 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0be33e07-83c0-4b4d-b24c-3b9c98467671-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0be33e07-83c0-4b4d-b24c-3b9c98467671" (UID: "0be33e07-83c0-4b4d-b24c-3b9c98467671"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.496334 4848 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0be33e07-83c0-4b4d-b24c-3b9c98467671-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.556063 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zndr7_must-gather-2cxt8_0be33e07-83c0-4b4d-b24c-3b9c98467671/copy/0.log"
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.556484 4848 generic.go:334] "Generic (PLEG): container finished" podID="0be33e07-83c0-4b4d-b24c-3b9c98467671" containerID="2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e" exitCode=143
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.557029 4848 scope.go:117] "RemoveContainer" containerID="2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e"
Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.557118 4848 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-zndr7/must-gather-2cxt8" Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.577621 4848 scope.go:117] "RemoveContainer" containerID="d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9" Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.654661 4848 scope.go:117] "RemoveContainer" containerID="2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e" Feb 17 10:03:03 crc kubenswrapper[4848]: E0217 10:03:03.655113 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e\": container with ID starting with 2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e not found: ID does not exist" containerID="2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e" Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.655156 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e"} err="failed to get container status \"2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e\": rpc error: code = NotFound desc = could not find container \"2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e\": container with ID starting with 2ba7a47f6994e09b85c08c06328e9151c2e22c9bbbeb43fdbf9b875b5025300e not found: ID does not exist" Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.655182 4848 scope.go:117] "RemoveContainer" containerID="d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9" Feb 17 10:03:03 crc kubenswrapper[4848]: E0217 10:03:03.655513 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9\": container with ID starting with 
d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9 not found: ID does not exist" containerID="d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9" Feb 17 10:03:03 crc kubenswrapper[4848]: I0217 10:03:03.655570 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9"} err="failed to get container status \"d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9\": rpc error: code = NotFound desc = could not find container \"d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9\": container with ID starting with d33a65e48d72e651fb49d9ed7d1b745f59c47927bc23b8032ab0bfd5d48114d9 not found: ID does not exist" Feb 17 10:03:05 crc kubenswrapper[4848]: I0217 10:03:05.403726 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0be33e07-83c0-4b4d-b24c-3b9c98467671" path="/var/lib/kubelet/pods/0be33e07-83c0-4b4d-b24c-3b9c98467671/volumes" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.755942 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6v222"] Feb 17 10:03:47 crc kubenswrapper[4848]: E0217 10:03:47.758350 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be33e07-83c0-4b4d-b24c-3b9c98467671" containerName="copy" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.758380 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be33e07-83c0-4b4d-b24c-3b9c98467671" containerName="copy" Feb 17 10:03:47 crc kubenswrapper[4848]: E0217 10:03:47.758418 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0be33e07-83c0-4b4d-b24c-3b9c98467671" containerName="gather" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.758430 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="0be33e07-83c0-4b4d-b24c-3b9c98467671" containerName="gather" Feb 17 10:03:47 crc kubenswrapper[4848]: E0217 
10:03:47.758457 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerName="extract-content" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.758473 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerName="extract-content" Feb 17 10:03:47 crc kubenswrapper[4848]: E0217 10:03:47.758532 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerName="registry-server" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.758548 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerName="registry-server" Feb 17 10:03:47 crc kubenswrapper[4848]: E0217 10:03:47.758582 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerName="extract-utilities" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.758595 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerName="extract-utilities" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.758993 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="69a682b8-88ce-47ea-b82c-77ceb490cccf" containerName="registry-server" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.759028 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be33e07-83c0-4b4d-b24c-3b9c98467671" containerName="copy" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.759049 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="0be33e07-83c0-4b4d-b24c-3b9c98467671" containerName="gather" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.764302 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.770785 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6v222"] Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.827278 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mk4bk\" (UniqueName: \"kubernetes.io/projected/7024f506-1f8f-4626-870a-3f64bacda12d-kube-api-access-mk4bk\") pod \"certified-operators-6v222\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.827659 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-utilities\") pod \"certified-operators-6v222\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.827773 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-catalog-content\") pod \"certified-operators-6v222\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.929493 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-utilities\") pod \"certified-operators-6v222\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.929961 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-catalog-content\") pod \"certified-operators-6v222\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.930061 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-utilities\") pod \"certified-operators-6v222\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.930242 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mk4bk\" (UniqueName: \"kubernetes.io/projected/7024f506-1f8f-4626-870a-3f64bacda12d-kube-api-access-mk4bk\") pod \"certified-operators-6v222\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.930450 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-catalog-content\") pod \"certified-operators-6v222\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:47 crc kubenswrapper[4848]: I0217 10:03:47.953749 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mk4bk\" (UniqueName: \"kubernetes.io/projected/7024f506-1f8f-4626-870a-3f64bacda12d-kube-api-access-mk4bk\") pod \"certified-operators-6v222\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:48 crc kubenswrapper[4848]: I0217 10:03:48.099473 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:48 crc kubenswrapper[4848]: I0217 10:03:48.631090 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6v222"] Feb 17 10:03:49 crc kubenswrapper[4848]: I0217 10:03:49.133665 4848 generic.go:334] "Generic (PLEG): container finished" podID="7024f506-1f8f-4626-870a-3f64bacda12d" containerID="662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52" exitCode=0 Feb 17 10:03:49 crc kubenswrapper[4848]: I0217 10:03:49.133947 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v222" event={"ID":"7024f506-1f8f-4626-870a-3f64bacda12d","Type":"ContainerDied","Data":"662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52"} Feb 17 10:03:49 crc kubenswrapper[4848]: I0217 10:03:49.134030 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v222" event={"ID":"7024f506-1f8f-4626-870a-3f64bacda12d","Type":"ContainerStarted","Data":"005a99647354a01d2c3b53460b59ed06dc6d670d51271055d8863b81c3c639cb"} Feb 17 10:03:50 crc kubenswrapper[4848]: I0217 10:03:50.145578 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v222" event={"ID":"7024f506-1f8f-4626-870a-3f64bacda12d","Type":"ContainerStarted","Data":"cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a"} Feb 17 10:03:52 crc kubenswrapper[4848]: I0217 10:03:52.169522 4848 generic.go:334] "Generic (PLEG): container finished" podID="7024f506-1f8f-4626-870a-3f64bacda12d" containerID="cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a" exitCode=0 Feb 17 10:03:52 crc kubenswrapper[4848]: I0217 10:03:52.169610 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v222" 
event={"ID":"7024f506-1f8f-4626-870a-3f64bacda12d","Type":"ContainerDied","Data":"cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a"} Feb 17 10:03:54 crc kubenswrapper[4848]: I0217 10:03:54.194035 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v222" event={"ID":"7024f506-1f8f-4626-870a-3f64bacda12d","Type":"ContainerStarted","Data":"12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5"} Feb 17 10:03:54 crc kubenswrapper[4848]: I0217 10:03:54.219692 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6v222" podStartSLOduration=3.284768168 podStartE2EDuration="7.219668002s" podCreationTimestamp="2026-02-17 10:03:47 +0000 UTC" firstStartedPulling="2026-02-17 10:03:49.135976611 +0000 UTC m=+3506.679232317" lastFinishedPulling="2026-02-17 10:03:53.070876505 +0000 UTC m=+3510.614132151" observedRunningTime="2026-02-17 10:03:54.213750354 +0000 UTC m=+3511.757006000" watchObservedRunningTime="2026-02-17 10:03:54.219668002 +0000 UTC m=+3511.762923658" Feb 17 10:03:58 crc kubenswrapper[4848]: I0217 10:03:58.100075 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:58 crc kubenswrapper[4848]: I0217 10:03:58.100950 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:58 crc kubenswrapper[4848]: I0217 10:03:58.167389 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:58 crc kubenswrapper[4848]: I0217 10:03:58.289723 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:03:58 crc kubenswrapper[4848]: I0217 10:03:58.405489 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-6v222"] Feb 17 10:04:00 crc kubenswrapper[4848]: I0217 10:04:00.260138 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6v222" podUID="7024f506-1f8f-4626-870a-3f64bacda12d" containerName="registry-server" containerID="cri-o://12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5" gracePeriod=2 Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.194145 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.270181 4848 generic.go:334] "Generic (PLEG): container finished" podID="7024f506-1f8f-4626-870a-3f64bacda12d" containerID="12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5" exitCode=0 Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.270236 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v222" event={"ID":"7024f506-1f8f-4626-870a-3f64bacda12d","Type":"ContainerDied","Data":"12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5"} Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.270975 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6v222" event={"ID":"7024f506-1f8f-4626-870a-3f64bacda12d","Type":"ContainerDied","Data":"005a99647354a01d2c3b53460b59ed06dc6d670d51271055d8863b81c3c639cb"} Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.270278 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6v222" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.271035 4848 scope.go:117] "RemoveContainer" containerID="12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.290876 4848 scope.go:117] "RemoveContainer" containerID="cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.315093 4848 scope.go:117] "RemoveContainer" containerID="662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.318277 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-catalog-content\") pod \"7024f506-1f8f-4626-870a-3f64bacda12d\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.318551 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mk4bk\" (UniqueName: \"kubernetes.io/projected/7024f506-1f8f-4626-870a-3f64bacda12d-kube-api-access-mk4bk\") pod \"7024f506-1f8f-4626-870a-3f64bacda12d\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.318692 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-utilities\") pod \"7024f506-1f8f-4626-870a-3f64bacda12d\" (UID: \"7024f506-1f8f-4626-870a-3f64bacda12d\") " Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.319304 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-utilities" (OuterVolumeSpecName: "utilities") pod "7024f506-1f8f-4626-870a-3f64bacda12d" (UID: 
"7024f506-1f8f-4626-870a-3f64bacda12d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.325565 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7024f506-1f8f-4626-870a-3f64bacda12d-kube-api-access-mk4bk" (OuterVolumeSpecName: "kube-api-access-mk4bk") pod "7024f506-1f8f-4626-870a-3f64bacda12d" (UID: "7024f506-1f8f-4626-870a-3f64bacda12d"). InnerVolumeSpecName "kube-api-access-mk4bk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.386866 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7024f506-1f8f-4626-870a-3f64bacda12d" (UID: "7024f506-1f8f-4626-870a-3f64bacda12d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.401218 4848 scope.go:117] "RemoveContainer" containerID="12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5" Feb 17 10:04:01 crc kubenswrapper[4848]: E0217 10:04:01.401608 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5\": container with ID starting with 12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5 not found: ID does not exist" containerID="12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.401664 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5"} err="failed to get container status 
\"12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5\": rpc error: code = NotFound desc = could not find container \"12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5\": container with ID starting with 12114aa746ecaf1932e4fdae7a924e963f74bad03bb3a09024067083406085d5 not found: ID does not exist" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.401695 4848 scope.go:117] "RemoveContainer" containerID="cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a" Feb 17 10:04:01 crc kubenswrapper[4848]: E0217 10:04:01.402121 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a\": container with ID starting with cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a not found: ID does not exist" containerID="cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.402241 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a"} err="failed to get container status \"cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a\": rpc error: code = NotFound desc = could not find container \"cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a\": container with ID starting with cb0d52a9b83eb10b629ce1e596b34df93f5331e3ae91cf859fe803bb32631e5a not found: ID does not exist" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.402326 4848 scope.go:117] "RemoveContainer" containerID="662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52" Feb 17 10:04:01 crc kubenswrapper[4848]: E0217 10:04:01.402706 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52\": container with ID starting with 662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52 not found: ID does not exist" containerID="662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.402728 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52"} err="failed to get container status \"662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52\": rpc error: code = NotFound desc = could not find container \"662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52\": container with ID starting with 662618f2ccf3597b275742e32e50da40ee03989beb21a98acc6866cc7ca8eb52 not found: ID does not exist" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.421355 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.421399 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mk4bk\" (UniqueName: \"kubernetes.io/projected/7024f506-1f8f-4626-870a-3f64bacda12d-kube-api-access-mk4bk\") on node \"crc\" DevicePath \"\"" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.421433 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7024f506-1f8f-4626-870a-3f64bacda12d-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.606670 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6v222"] Feb 17 10:04:01 crc kubenswrapper[4848]: I0217 10:04:01.621129 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/certified-operators-6v222"] Feb 17 10:04:03 crc kubenswrapper[4848]: I0217 10:04:03.397429 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7024f506-1f8f-4626-870a-3f64bacda12d" path="/var/lib/kubelet/pods/7024f506-1f8f-4626-870a-3f64bacda12d/volumes" Feb 17 10:04:29 crc kubenswrapper[4848]: I0217 10:04:29.315589 4848 scope.go:117] "RemoveContainer" containerID="8f6717ae42baab431754271d57fec65e690202617cf6eae950dc50929a27c255" Feb 17 10:04:48 crc kubenswrapper[4848]: I0217 10:04:48.771856 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 10:04:48 crc kubenswrapper[4848]: I0217 10:04:48.772784 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 10:05:18 crc kubenswrapper[4848]: I0217 10:05:18.772268 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 10:05:18 crc kubenswrapper[4848]: I0217 10:05:18.773992 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 10:05:48 crc 
kubenswrapper[4848]: I0217 10:05:48.772135 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 10:05:48 crc kubenswrapper[4848]: I0217 10:05:48.774357 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 10:05:48 crc kubenswrapper[4848]: I0217 10:05:48.774522 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 10:05:48 crc kubenswrapper[4848]: I0217 10:05:48.775498 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf1fbd7c833a9345e13be53113b338550eebd27fd5bf7c6bcfec90cfb0ca3555"} pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 10:05:48 crc kubenswrapper[4848]: I0217 10:05:48.775724 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://bf1fbd7c833a9345e13be53113b338550eebd27fd5bf7c6bcfec90cfb0ca3555" gracePeriod=600 Feb 17 10:05:49 crc kubenswrapper[4848]: I0217 10:05:49.400654 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" 
containerID="bf1fbd7c833a9345e13be53113b338550eebd27fd5bf7c6bcfec90cfb0ca3555" exitCode=0 Feb 17 10:05:49 crc kubenswrapper[4848]: I0217 10:05:49.400748 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"bf1fbd7c833a9345e13be53113b338550eebd27fd5bf7c6bcfec90cfb0ca3555"} Feb 17 10:05:49 crc kubenswrapper[4848]: I0217 10:05:49.401174 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab"} Feb 17 10:05:49 crc kubenswrapper[4848]: I0217 10:05:49.401206 4848 scope.go:117] "RemoveContainer" containerID="3fc0ebf6cbe9ac5dc81ac9158218fe1c19006339c0a94ca3f9c3c813d818c5d5" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.181495 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6g2kf/must-gather-86tjn"] Feb 17 10:05:57 crc kubenswrapper[4848]: E0217 10:05:57.182802 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7024f506-1f8f-4626-870a-3f64bacda12d" containerName="registry-server" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.182822 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7024f506-1f8f-4626-870a-3f64bacda12d" containerName="registry-server" Feb 17 10:05:57 crc kubenswrapper[4848]: E0217 10:05:57.182845 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7024f506-1f8f-4626-870a-3f64bacda12d" containerName="extract-utilities" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.182854 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7024f506-1f8f-4626-870a-3f64bacda12d" containerName="extract-utilities" Feb 17 10:05:57 crc kubenswrapper[4848]: E0217 10:05:57.182876 4848 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7024f506-1f8f-4626-870a-3f64bacda12d" containerName="extract-content" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.182885 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7024f506-1f8f-4626-870a-3f64bacda12d" containerName="extract-content" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.183092 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7024f506-1f8f-4626-870a-3f64bacda12d" containerName="registry-server" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.184340 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6g2kf/must-gather-86tjn" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.188342 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6g2kf"/"kube-root-ca.crt" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.188660 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-6g2kf"/"openshift-service-ca.crt" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.189181 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-6g2kf"/"default-dockercfg-zmkfj" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.191800 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6g2kf/must-gather-86tjn"] Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.256301 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fsmj\" (UniqueName: \"kubernetes.io/projected/f850b836-0ebb-4173-aa4a-10deba9cfc12-kube-api-access-9fsmj\") pod \"must-gather-86tjn\" (UID: \"f850b836-0ebb-4173-aa4a-10deba9cfc12\") " pod="openshift-must-gather-6g2kf/must-gather-86tjn" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.256369 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f850b836-0ebb-4173-aa4a-10deba9cfc12-must-gather-output\") pod \"must-gather-86tjn\" (UID: \"f850b836-0ebb-4173-aa4a-10deba9cfc12\") " pod="openshift-must-gather-6g2kf/must-gather-86tjn" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.358318 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fsmj\" (UniqueName: \"kubernetes.io/projected/f850b836-0ebb-4173-aa4a-10deba9cfc12-kube-api-access-9fsmj\") pod \"must-gather-86tjn\" (UID: \"f850b836-0ebb-4173-aa4a-10deba9cfc12\") " pod="openshift-must-gather-6g2kf/must-gather-86tjn" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.358388 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f850b836-0ebb-4173-aa4a-10deba9cfc12-must-gather-output\") pod \"must-gather-86tjn\" (UID: \"f850b836-0ebb-4173-aa4a-10deba9cfc12\") " pod="openshift-must-gather-6g2kf/must-gather-86tjn" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.358894 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f850b836-0ebb-4173-aa4a-10deba9cfc12-must-gather-output\") pod \"must-gather-86tjn\" (UID: \"f850b836-0ebb-4173-aa4a-10deba9cfc12\") " pod="openshift-must-gather-6g2kf/must-gather-86tjn" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.380901 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fsmj\" (UniqueName: \"kubernetes.io/projected/f850b836-0ebb-4173-aa4a-10deba9cfc12-kube-api-access-9fsmj\") pod \"must-gather-86tjn\" (UID: \"f850b836-0ebb-4173-aa4a-10deba9cfc12\") " pod="openshift-must-gather-6g2kf/must-gather-86tjn" Feb 17 10:05:57 crc kubenswrapper[4848]: I0217 10:05:57.504202 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6g2kf/must-gather-86tjn" Feb 17 10:05:58 crc kubenswrapper[4848]: I0217 10:05:58.012801 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6g2kf/must-gather-86tjn"] Feb 17 10:05:58 crc kubenswrapper[4848]: I0217 10:05:58.510221 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/must-gather-86tjn" event={"ID":"f850b836-0ebb-4173-aa4a-10deba9cfc12","Type":"ContainerStarted","Data":"ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a"} Feb 17 10:05:58 crc kubenswrapper[4848]: I0217 10:05:58.510615 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/must-gather-86tjn" event={"ID":"f850b836-0ebb-4173-aa4a-10deba9cfc12","Type":"ContainerStarted","Data":"8bcb2c0146880932454308e500aa85e41c0b3354d0beafa7ab1e68f6c27f0303"} Feb 17 10:05:59 crc kubenswrapper[4848]: I0217 10:05:59.521172 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/must-gather-86tjn" event={"ID":"f850b836-0ebb-4173-aa4a-10deba9cfc12","Type":"ContainerStarted","Data":"437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9"} Feb 17 10:05:59 crc kubenswrapper[4848]: I0217 10:05:59.546365 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6g2kf/must-gather-86tjn" podStartSLOduration=2.546339408 podStartE2EDuration="2.546339408s" podCreationTimestamp="2026-02-17 10:05:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 10:05:59.541743508 +0000 UTC m=+3637.084999154" watchObservedRunningTime="2026-02-17 10:05:59.546339408 +0000 UTC m=+3637.089595064" Feb 17 10:06:01 crc kubenswrapper[4848]: I0217 10:06:01.771604 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6g2kf/crc-debug-vwscf"] Feb 17 10:06:01 crc kubenswrapper[4848]: 
I0217 10:06:01.773349 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-vwscf" Feb 17 10:06:01 crc kubenswrapper[4848]: I0217 10:06:01.839870 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54b85d50-af38-4bd5-a1c6-759556796d64-host\") pod \"crc-debug-vwscf\" (UID: \"54b85d50-af38-4bd5-a1c6-759556796d64\") " pod="openshift-must-gather-6g2kf/crc-debug-vwscf" Feb 17 10:06:01 crc kubenswrapper[4848]: I0217 10:06:01.840279 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6gml\" (UniqueName: \"kubernetes.io/projected/54b85d50-af38-4bd5-a1c6-759556796d64-kube-api-access-g6gml\") pod \"crc-debug-vwscf\" (UID: \"54b85d50-af38-4bd5-a1c6-759556796d64\") " pod="openshift-must-gather-6g2kf/crc-debug-vwscf" Feb 17 10:06:01 crc kubenswrapper[4848]: I0217 10:06:01.941887 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54b85d50-af38-4bd5-a1c6-759556796d64-host\") pod \"crc-debug-vwscf\" (UID: \"54b85d50-af38-4bd5-a1c6-759556796d64\") " pod="openshift-must-gather-6g2kf/crc-debug-vwscf" Feb 17 10:06:01 crc kubenswrapper[4848]: I0217 10:06:01.942056 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6gml\" (UniqueName: \"kubernetes.io/projected/54b85d50-af38-4bd5-a1c6-759556796d64-kube-api-access-g6gml\") pod \"crc-debug-vwscf\" (UID: \"54b85d50-af38-4bd5-a1c6-759556796d64\") " pod="openshift-must-gather-6g2kf/crc-debug-vwscf" Feb 17 10:06:01 crc kubenswrapper[4848]: I0217 10:06:01.942079 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54b85d50-af38-4bd5-a1c6-759556796d64-host\") pod \"crc-debug-vwscf\" (UID: \"54b85d50-af38-4bd5-a1c6-759556796d64\") 
" pod="openshift-must-gather-6g2kf/crc-debug-vwscf" Feb 17 10:06:01 crc kubenswrapper[4848]: I0217 10:06:01.960674 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6gml\" (UniqueName: \"kubernetes.io/projected/54b85d50-af38-4bd5-a1c6-759556796d64-kube-api-access-g6gml\") pod \"crc-debug-vwscf\" (UID: \"54b85d50-af38-4bd5-a1c6-759556796d64\") " pod="openshift-must-gather-6g2kf/crc-debug-vwscf" Feb 17 10:06:02 crc kubenswrapper[4848]: I0217 10:06:02.097555 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-vwscf" Feb 17 10:06:02 crc kubenswrapper[4848]: W0217 10:06:02.138276 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod54b85d50_af38_4bd5_a1c6_759556796d64.slice/crio-5651e24fb6e1e3c4de6bfbd7b39d0600172a51c4d1b38df368a63d1eaddc06ee WatchSource:0}: Error finding container 5651e24fb6e1e3c4de6bfbd7b39d0600172a51c4d1b38df368a63d1eaddc06ee: Status 404 returned error can't find the container with id 5651e24fb6e1e3c4de6bfbd7b39d0600172a51c4d1b38df368a63d1eaddc06ee Feb 17 10:06:02 crc kubenswrapper[4848]: I0217 10:06:02.544612 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/crc-debug-vwscf" event={"ID":"54b85d50-af38-4bd5-a1c6-759556796d64","Type":"ContainerStarted","Data":"6823708f0e44c918a009b25b1d6766db8fa31e283ff2b07b90439578f04c032c"} Feb 17 10:06:02 crc kubenswrapper[4848]: I0217 10:06:02.545000 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/crc-debug-vwscf" event={"ID":"54b85d50-af38-4bd5-a1c6-759556796d64","Type":"ContainerStarted","Data":"5651e24fb6e1e3c4de6bfbd7b39d0600172a51c4d1b38df368a63d1eaddc06ee"} Feb 17 10:06:02 crc kubenswrapper[4848]: I0217 10:06:02.563855 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6g2kf/crc-debug-vwscf" 
podStartSLOduration=1.5638327950000002 podStartE2EDuration="1.563832795s" podCreationTimestamp="2026-02-17 10:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 10:06:02.557561836 +0000 UTC m=+3640.100817492" watchObservedRunningTime="2026-02-17 10:06:02.563832795 +0000 UTC m=+3640.107088451" Feb 17 10:06:35 crc kubenswrapper[4848]: I0217 10:06:35.816077 4848 generic.go:334] "Generic (PLEG): container finished" podID="54b85d50-af38-4bd5-a1c6-759556796d64" containerID="6823708f0e44c918a009b25b1d6766db8fa31e283ff2b07b90439578f04c032c" exitCode=0 Feb 17 10:06:35 crc kubenswrapper[4848]: I0217 10:06:35.816126 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/crc-debug-vwscf" event={"ID":"54b85d50-af38-4bd5-a1c6-759556796d64","Type":"ContainerDied","Data":"6823708f0e44c918a009b25b1d6766db8fa31e283ff2b07b90439578f04c032c"} Feb 17 10:06:36 crc kubenswrapper[4848]: I0217 10:06:36.945524 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-vwscf" Feb 17 10:06:36 crc kubenswrapper[4848]: I0217 10:06:36.984472 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6g2kf/crc-debug-vwscf"] Feb 17 10:06:36 crc kubenswrapper[4848]: I0217 10:06:36.993443 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6g2kf/crc-debug-vwscf"] Feb 17 10:06:37 crc kubenswrapper[4848]: I0217 10:06:37.094465 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6gml\" (UniqueName: \"kubernetes.io/projected/54b85d50-af38-4bd5-a1c6-759556796d64-kube-api-access-g6gml\") pod \"54b85d50-af38-4bd5-a1c6-759556796d64\" (UID: \"54b85d50-af38-4bd5-a1c6-759556796d64\") " Feb 17 10:06:37 crc kubenswrapper[4848]: I0217 10:06:37.094940 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54b85d50-af38-4bd5-a1c6-759556796d64-host\") pod \"54b85d50-af38-4bd5-a1c6-759556796d64\" (UID: \"54b85d50-af38-4bd5-a1c6-759556796d64\") " Feb 17 10:06:37 crc kubenswrapper[4848]: I0217 10:06:37.095543 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/54b85d50-af38-4bd5-a1c6-759556796d64-host" (OuterVolumeSpecName: "host") pod "54b85d50-af38-4bd5-a1c6-759556796d64" (UID: "54b85d50-af38-4bd5-a1c6-759556796d64"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 10:06:37 crc kubenswrapper[4848]: I0217 10:06:37.099847 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54b85d50-af38-4bd5-a1c6-759556796d64-kube-api-access-g6gml" (OuterVolumeSpecName: "kube-api-access-g6gml") pod "54b85d50-af38-4bd5-a1c6-759556796d64" (UID: "54b85d50-af38-4bd5-a1c6-759556796d64"). InnerVolumeSpecName "kube-api-access-g6gml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 10:06:37 crc kubenswrapper[4848]: I0217 10:06:37.197633 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/54b85d50-af38-4bd5-a1c6-759556796d64-host\") on node \"crc\" DevicePath \"\"" Feb 17 10:06:37 crc kubenswrapper[4848]: I0217 10:06:37.197688 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6gml\" (UniqueName: \"kubernetes.io/projected/54b85d50-af38-4bd5-a1c6-759556796d64-kube-api-access-g6gml\") on node \"crc\" DevicePath \"\"" Feb 17 10:06:37 crc kubenswrapper[4848]: I0217 10:06:37.398396 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54b85d50-af38-4bd5-a1c6-759556796d64" path="/var/lib/kubelet/pods/54b85d50-af38-4bd5-a1c6-759556796d64/volumes" Feb 17 10:06:37 crc kubenswrapper[4848]: I0217 10:06:37.833200 4848 scope.go:117] "RemoveContainer" containerID="6823708f0e44c918a009b25b1d6766db8fa31e283ff2b07b90439578f04c032c" Feb 17 10:06:37 crc kubenswrapper[4848]: I0217 10:06:37.833254 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-vwscf" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.228322 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6g2kf/crc-debug-cfgm4"] Feb 17 10:06:38 crc kubenswrapper[4848]: E0217 10:06:38.228672 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54b85d50-af38-4bd5-a1c6-759556796d64" containerName="container-00" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.228682 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="54b85d50-af38-4bd5-a1c6-759556796d64" containerName="container-00" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.228938 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="54b85d50-af38-4bd5-a1c6-759556796d64" containerName="container-00" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.229486 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.317778 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgcr7\" (UniqueName: \"kubernetes.io/projected/2938f11c-ccb3-4ee6-95d4-10191cbed968-kube-api-access-fgcr7\") pod \"crc-debug-cfgm4\" (UID: \"2938f11c-ccb3-4ee6-95d4-10191cbed968\") " pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.317840 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2938f11c-ccb3-4ee6-95d4-10191cbed968-host\") pod \"crc-debug-cfgm4\" (UID: \"2938f11c-ccb3-4ee6-95d4-10191cbed968\") " pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.419120 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgcr7\" (UniqueName: 
\"kubernetes.io/projected/2938f11c-ccb3-4ee6-95d4-10191cbed968-kube-api-access-fgcr7\") pod \"crc-debug-cfgm4\" (UID: \"2938f11c-ccb3-4ee6-95d4-10191cbed968\") " pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.419459 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2938f11c-ccb3-4ee6-95d4-10191cbed968-host\") pod \"crc-debug-cfgm4\" (UID: \"2938f11c-ccb3-4ee6-95d4-10191cbed968\") " pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.419649 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2938f11c-ccb3-4ee6-95d4-10191cbed968-host\") pod \"crc-debug-cfgm4\" (UID: \"2938f11c-ccb3-4ee6-95d4-10191cbed968\") " pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.447672 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgcr7\" (UniqueName: \"kubernetes.io/projected/2938f11c-ccb3-4ee6-95d4-10191cbed968-kube-api-access-fgcr7\") pod \"crc-debug-cfgm4\" (UID: \"2938f11c-ccb3-4ee6-95d4-10191cbed968\") " pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.544094 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.843399 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" event={"ID":"2938f11c-ccb3-4ee6-95d4-10191cbed968","Type":"ContainerStarted","Data":"2f36a181c44d3f52f60541d4cde6c4013dbd5d6a4443c40a1806a3cc8edf93fd"} Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.843781 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" event={"ID":"2938f11c-ccb3-4ee6-95d4-10191cbed968","Type":"ContainerStarted","Data":"f8e3d2d9481da2839b55d9efc71d8354dc62c900715a432544ce390b5e1801e5"} Feb 17 10:06:38 crc kubenswrapper[4848]: I0217 10:06:38.859711 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" podStartSLOduration=0.859693925 podStartE2EDuration="859.693925ms" podCreationTimestamp="2026-02-17 10:06:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-17 10:06:38.856096053 +0000 UTC m=+3676.399351699" watchObservedRunningTime="2026-02-17 10:06:38.859693925 +0000 UTC m=+3676.402949571" Feb 17 10:06:39 crc kubenswrapper[4848]: I0217 10:06:39.854936 4848 generic.go:334] "Generic (PLEG): container finished" podID="2938f11c-ccb3-4ee6-95d4-10191cbed968" containerID="2f36a181c44d3f52f60541d4cde6c4013dbd5d6a4443c40a1806a3cc8edf93fd" exitCode=0 Feb 17 10:06:39 crc kubenswrapper[4848]: I0217 10:06:39.854997 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" event={"ID":"2938f11c-ccb3-4ee6-95d4-10191cbed968","Type":"ContainerDied","Data":"2f36a181c44d3f52f60541d4cde6c4013dbd5d6a4443c40a1806a3cc8edf93fd"} Feb 17 10:06:40 crc kubenswrapper[4848]: I0217 10:06:40.973689 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.005050 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6g2kf/crc-debug-cfgm4"] Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.013409 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6g2kf/crc-debug-cfgm4"] Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.095693 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgcr7\" (UniqueName: \"kubernetes.io/projected/2938f11c-ccb3-4ee6-95d4-10191cbed968-kube-api-access-fgcr7\") pod \"2938f11c-ccb3-4ee6-95d4-10191cbed968\" (UID: \"2938f11c-ccb3-4ee6-95d4-10191cbed968\") " Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.095779 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2938f11c-ccb3-4ee6-95d4-10191cbed968-host\") pod \"2938f11c-ccb3-4ee6-95d4-10191cbed968\" (UID: \"2938f11c-ccb3-4ee6-95d4-10191cbed968\") " Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.095910 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2938f11c-ccb3-4ee6-95d4-10191cbed968-host" (OuterVolumeSpecName: "host") pod "2938f11c-ccb3-4ee6-95d4-10191cbed968" (UID: "2938f11c-ccb3-4ee6-95d4-10191cbed968"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.096476 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2938f11c-ccb3-4ee6-95d4-10191cbed968-host\") on node \"crc\" DevicePath \"\"" Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.114425 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2938f11c-ccb3-4ee6-95d4-10191cbed968-kube-api-access-fgcr7" (OuterVolumeSpecName: "kube-api-access-fgcr7") pod "2938f11c-ccb3-4ee6-95d4-10191cbed968" (UID: "2938f11c-ccb3-4ee6-95d4-10191cbed968"). InnerVolumeSpecName "kube-api-access-fgcr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.199120 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgcr7\" (UniqueName: \"kubernetes.io/projected/2938f11c-ccb3-4ee6-95d4-10191cbed968-kube-api-access-fgcr7\") on node \"crc\" DevicePath \"\"" Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.393313 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2938f11c-ccb3-4ee6-95d4-10191cbed968" path="/var/lib/kubelet/pods/2938f11c-ccb3-4ee6-95d4-10191cbed968/volumes" Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.874749 4848 scope.go:117] "RemoveContainer" containerID="2f36a181c44d3f52f60541d4cde6c4013dbd5d6a4443c40a1806a3cc8edf93fd" Feb 17 10:06:41 crc kubenswrapper[4848]: I0217 10:06:41.874900 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-cfgm4" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.177931 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6g2kf/crc-debug-f58k2"] Feb 17 10:06:42 crc kubenswrapper[4848]: E0217 10:06:42.178296 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2938f11c-ccb3-4ee6-95d4-10191cbed968" containerName="container-00" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.178308 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="2938f11c-ccb3-4ee6-95d4-10191cbed968" containerName="container-00" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.178497 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="2938f11c-ccb3-4ee6-95d4-10191cbed968" containerName="container-00" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.179771 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-f58k2" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.215430 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v6w6\" (UniqueName: \"kubernetes.io/projected/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-kube-api-access-6v6w6\") pod \"crc-debug-f58k2\" (UID: \"b8efe23f-eb26-4c3e-b482-3ac62d4a766b\") " pod="openshift-must-gather-6g2kf/crc-debug-f58k2" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.215557 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-host\") pod \"crc-debug-f58k2\" (UID: \"b8efe23f-eb26-4c3e-b482-3ac62d4a766b\") " pod="openshift-must-gather-6g2kf/crc-debug-f58k2" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.317021 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-host\") pod \"crc-debug-f58k2\" (UID: \"b8efe23f-eb26-4c3e-b482-3ac62d4a766b\") " pod="openshift-must-gather-6g2kf/crc-debug-f58k2" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.317256 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v6w6\" (UniqueName: \"kubernetes.io/projected/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-kube-api-access-6v6w6\") pod \"crc-debug-f58k2\" (UID: \"b8efe23f-eb26-4c3e-b482-3ac62d4a766b\") " pod="openshift-must-gather-6g2kf/crc-debug-f58k2" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.317787 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-host\") pod \"crc-debug-f58k2\" (UID: \"b8efe23f-eb26-4c3e-b482-3ac62d4a766b\") " pod="openshift-must-gather-6g2kf/crc-debug-f58k2" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.345390 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v6w6\" (UniqueName: \"kubernetes.io/projected/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-kube-api-access-6v6w6\") pod \"crc-debug-f58k2\" (UID: \"b8efe23f-eb26-4c3e-b482-3ac62d4a766b\") " pod="openshift-must-gather-6g2kf/crc-debug-f58k2" Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.504833 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-f58k2" Feb 17 10:06:42 crc kubenswrapper[4848]: W0217 10:06:42.531137 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8efe23f_eb26_4c3e_b482_3ac62d4a766b.slice/crio-fafed8d93f9aaeca3762c91ffe05c4ac3d7408df290a97420e14f1ecbe8d7602 WatchSource:0}: Error finding container fafed8d93f9aaeca3762c91ffe05c4ac3d7408df290a97420e14f1ecbe8d7602: Status 404 returned error can't find the container with id fafed8d93f9aaeca3762c91ffe05c4ac3d7408df290a97420e14f1ecbe8d7602 Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.885639 4848 generic.go:334] "Generic (PLEG): container finished" podID="b8efe23f-eb26-4c3e-b482-3ac62d4a766b" containerID="81c22e69ce0ebbaf294c4dc53be91970fb4911ecabc6bc4c849ae17e64ffe58a" exitCode=0 Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.885708 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/crc-debug-f58k2" event={"ID":"b8efe23f-eb26-4c3e-b482-3ac62d4a766b","Type":"ContainerDied","Data":"81c22e69ce0ebbaf294c4dc53be91970fb4911ecabc6bc4c849ae17e64ffe58a"} Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.885962 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/crc-debug-f58k2" event={"ID":"b8efe23f-eb26-4c3e-b482-3ac62d4a766b","Type":"ContainerStarted","Data":"fafed8d93f9aaeca3762c91ffe05c4ac3d7408df290a97420e14f1ecbe8d7602"} Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.924813 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6g2kf/crc-debug-f58k2"] Feb 17 10:06:42 crc kubenswrapper[4848]: I0217 10:06:42.932643 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6g2kf/crc-debug-f58k2"] Feb 17 10:06:44 crc kubenswrapper[4848]: I0217 10:06:44.015076 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-f58k2" Feb 17 10:06:44 crc kubenswrapper[4848]: I0217 10:06:44.152408 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-host\") pod \"b8efe23f-eb26-4c3e-b482-3ac62d4a766b\" (UID: \"b8efe23f-eb26-4c3e-b482-3ac62d4a766b\") " Feb 17 10:06:44 crc kubenswrapper[4848]: I0217 10:06:44.152595 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v6w6\" (UniqueName: \"kubernetes.io/projected/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-kube-api-access-6v6w6\") pod \"b8efe23f-eb26-4c3e-b482-3ac62d4a766b\" (UID: \"b8efe23f-eb26-4c3e-b482-3ac62d4a766b\") " Feb 17 10:06:44 crc kubenswrapper[4848]: I0217 10:06:44.153026 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-host" (OuterVolumeSpecName: "host") pod "b8efe23f-eb26-4c3e-b482-3ac62d4a766b" (UID: "b8efe23f-eb26-4c3e-b482-3ac62d4a766b"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 17 10:06:44 crc kubenswrapper[4848]: I0217 10:06:44.158248 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-kube-api-access-6v6w6" (OuterVolumeSpecName: "kube-api-access-6v6w6") pod "b8efe23f-eb26-4c3e-b482-3ac62d4a766b" (UID: "b8efe23f-eb26-4c3e-b482-3ac62d4a766b"). InnerVolumeSpecName "kube-api-access-6v6w6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 10:06:44 crc kubenswrapper[4848]: I0217 10:06:44.255049 4848 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-host\") on node \"crc\" DevicePath \"\"" Feb 17 10:06:44 crc kubenswrapper[4848]: I0217 10:06:44.255098 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v6w6\" (UniqueName: \"kubernetes.io/projected/b8efe23f-eb26-4c3e-b482-3ac62d4a766b-kube-api-access-6v6w6\") on node \"crc\" DevicePath \"\"" Feb 17 10:06:44 crc kubenswrapper[4848]: I0217 10:06:44.929280 4848 scope.go:117] "RemoveContainer" containerID="81c22e69ce0ebbaf294c4dc53be91970fb4911ecabc6bc4c849ae17e64ffe58a" Feb 17 10:06:44 crc kubenswrapper[4848]: I0217 10:06:44.929423 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-6g2kf/crc-debug-f58k2" Feb 17 10:06:45 crc kubenswrapper[4848]: I0217 10:06:45.393905 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8efe23f-eb26-4c3e-b482-3ac62d4a766b" path="/var/lib/kubelet/pods/b8efe23f-eb26-4c3e-b482-3ac62d4a766b/volumes" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.661307 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z79v4"] Feb 17 10:07:06 crc kubenswrapper[4848]: E0217 10:07:06.662613 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8efe23f-eb26-4c3e-b482-3ac62d4a766b" containerName="container-00" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.662635 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8efe23f-eb26-4c3e-b482-3ac62d4a766b" containerName="container-00" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.662918 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8efe23f-eb26-4c3e-b482-3ac62d4a766b" containerName="container-00" Feb 17 10:07:06 crc 
kubenswrapper[4848]: I0217 10:07:06.664589 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.675469 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z79v4"] Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.745292 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-utilities\") pod \"redhat-marketplace-z79v4\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.745328 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmd5l\" (UniqueName: \"kubernetes.io/projected/e4db3fa5-1874-4e1f-8db3-0507e8610158-kube-api-access-kmd5l\") pod \"redhat-marketplace-z79v4\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.745422 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-catalog-content\") pod \"redhat-marketplace-z79v4\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.846016 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-utilities\") pod \"redhat-marketplace-z79v4\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:06 crc 
kubenswrapper[4848]: I0217 10:07:06.846069 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmd5l\" (UniqueName: \"kubernetes.io/projected/e4db3fa5-1874-4e1f-8db3-0507e8610158-kube-api-access-kmd5l\") pod \"redhat-marketplace-z79v4\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.846152 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-catalog-content\") pod \"redhat-marketplace-z79v4\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.846568 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-utilities\") pod \"redhat-marketplace-z79v4\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.846599 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-catalog-content\") pod \"redhat-marketplace-z79v4\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:06 crc kubenswrapper[4848]: I0217 10:07:06.863742 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmd5l\" (UniqueName: \"kubernetes.io/projected/e4db3fa5-1874-4e1f-8db3-0507e8610158-kube-api-access-kmd5l\") pod \"redhat-marketplace-z79v4\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:07 crc kubenswrapper[4848]: I0217 
10:07:07.001123 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:07 crc kubenswrapper[4848]: I0217 10:07:07.524952 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z79v4"] Feb 17 10:07:08 crc kubenswrapper[4848]: I0217 10:07:08.161381 4848 generic.go:334] "Generic (PLEG): container finished" podID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerID="355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3" exitCode=0 Feb 17 10:07:08 crc kubenswrapper[4848]: I0217 10:07:08.161428 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z79v4" event={"ID":"e4db3fa5-1874-4e1f-8db3-0507e8610158","Type":"ContainerDied","Data":"355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3"} Feb 17 10:07:08 crc kubenswrapper[4848]: I0217 10:07:08.161701 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z79v4" event={"ID":"e4db3fa5-1874-4e1f-8db3-0507e8610158","Type":"ContainerStarted","Data":"06dd0e5408aecdde7d6713baa1039274304aff57838b47c2fb7036d5008150aa"} Feb 17 10:07:09 crc kubenswrapper[4848]: I0217 10:07:09.176208 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z79v4" event={"ID":"e4db3fa5-1874-4e1f-8db3-0507e8610158","Type":"ContainerStarted","Data":"d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8"} Feb 17 10:07:10 crc kubenswrapper[4848]: I0217 10:07:10.186345 4848 generic.go:334] "Generic (PLEG): container finished" podID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerID="d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8" exitCode=0 Feb 17 10:07:10 crc kubenswrapper[4848]: I0217 10:07:10.186409 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z79v4" 
event={"ID":"e4db3fa5-1874-4e1f-8db3-0507e8610158","Type":"ContainerDied","Data":"d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8"} Feb 17 10:07:11 crc kubenswrapper[4848]: I0217 10:07:11.197732 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z79v4" event={"ID":"e4db3fa5-1874-4e1f-8db3-0507e8610158","Type":"ContainerStarted","Data":"0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6"} Feb 17 10:07:11 crc kubenswrapper[4848]: I0217 10:07:11.243930 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z79v4" podStartSLOduration=2.8410100529999998 podStartE2EDuration="5.243910242s" podCreationTimestamp="2026-02-17 10:07:06 +0000 UTC" firstStartedPulling="2026-02-17 10:07:08.165600226 +0000 UTC m=+3705.708855902" lastFinishedPulling="2026-02-17 10:07:10.568500445 +0000 UTC m=+3708.111756091" observedRunningTime="2026-02-17 10:07:11.236441379 +0000 UTC m=+3708.779697025" watchObservedRunningTime="2026-02-17 10:07:11.243910242 +0000 UTC m=+3708.787165888" Feb 17 10:07:11 crc kubenswrapper[4848]: I0217 10:07:11.655362 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56c75f4b6d-vbll8_63d2c1b3-b181-4afe-8cb1-3049a34c47d2/barbican-api/0.log" Feb 17 10:07:11 crc kubenswrapper[4848]: I0217 10:07:11.703850 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-56c75f4b6d-vbll8_63d2c1b3-b181-4afe-8cb1-3049a34c47d2/barbican-api-log/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 10:07:12.027043 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6c677d9df8-z5nnn_49b9137d-75ca-4b52-9338-6bf15270a667/barbican-keystone-listener-log/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 10:07:12.136583 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-keystone-listener-6c677d9df8-z5nnn_49b9137d-75ca-4b52-9338-6bf15270a667/barbican-keystone-listener/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 10:07:12.292015 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-848f449699-2nhmn_dd92adeb-535d-4d36-a176-b5cd3ca667dc/barbican-worker/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 10:07:12.441363 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-848f449699-2nhmn_dd92adeb-535d-4d36-a176-b5cd3ca667dc/barbican-worker-log/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 10:07:12.450973 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-2hvp2_9eaf40fa-e0f2-445b-a17b-98f88fc76a5e/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 10:07:12.631602 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65294004-8016-43fa-8017-90cb36bb8dcb/ceilometer-central-agent/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 10:07:12.644291 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65294004-8016-43fa-8017-90cb36bb8dcb/ceilometer-notification-agent/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 10:07:12.665370 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65294004-8016-43fa-8017-90cb36bb8dcb/proxy-httpd/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 10:07:12.792694 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_65294004-8016-43fa-8017-90cb36bb8dcb/sg-core/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 10:07:12.857069 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a637d69f-499a-4308-89a8-fad8fe4e6d59/cinder-api/0.log" Feb 17 10:07:12 crc kubenswrapper[4848]: I0217 
10:07:12.983801 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_a637d69f-499a-4308-89a8-fad8fe4e6d59/cinder-api-log/0.log" Feb 17 10:07:13 crc kubenswrapper[4848]: I0217 10:07:13.100440 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_96a17dca-14f1-42ed-aca6-45fc15067cd3/cinder-scheduler/0.log" Feb 17 10:07:13 crc kubenswrapper[4848]: I0217 10:07:13.127541 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_96a17dca-14f1-42ed-aca6-45fc15067cd3/probe/0.log" Feb 17 10:07:13 crc kubenswrapper[4848]: I0217 10:07:13.262571 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-ln4pv_9c1fceab-33b4-4eee-8e26-c9bc2a35f018/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:13 crc kubenswrapper[4848]: I0217 10:07:13.376062 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-pqx9h_cbf54dd4-b933-400a-bef2-44bc87fbf3de/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:13 crc kubenswrapper[4848]: I0217 10:07:13.492961 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b9df5dcdc-8rdwv_84c45378-b510-419e-83b7-b92a19292d39/init/0.log" Feb 17 10:07:13 crc kubenswrapper[4848]: I0217 10:07:13.727077 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b9df5dcdc-8rdwv_84c45378-b510-419e-83b7-b92a19292d39/dnsmasq-dns/0.log" Feb 17 10:07:13 crc kubenswrapper[4848]: I0217 10:07:13.765396 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-b9df5dcdc-8rdwv_84c45378-b510-419e-83b7-b92a19292d39/init/0.log" Feb 17 10:07:13 crc kubenswrapper[4848]: I0217 10:07:13.775524 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-nztrh_502cc85d-fb83-4c34-825f-5aca6c880af7/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:13 crc kubenswrapper[4848]: I0217 10:07:13.971232 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379/glance-log/0.log" Feb 17 10:07:13 crc kubenswrapper[4848]: I0217 10:07:13.981976 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_ebc27fb3-ae13-4cc0-814d-ce4b3ecf8379/glance-httpd/0.log" Feb 17 10:07:14 crc kubenswrapper[4848]: I0217 10:07:14.161603 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f89c214-b934-465f-86ee-dec5f742237e/glance-httpd/0.log" Feb 17 10:07:14 crc kubenswrapper[4848]: I0217 10:07:14.248475 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_5f89c214-b934-465f-86ee-dec5f742237e/glance-log/0.log" Feb 17 10:07:14 crc kubenswrapper[4848]: I0217 10:07:14.289379 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-676bdd79dd-lq228_96fd6f0e-96ad-4a88-85ff-78f450b24279/horizon/0.log" Feb 17 10:07:14 crc kubenswrapper[4848]: I0217 10:07:14.540060 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-t7g8s_3b1521fa-5ab7-4b0a-b6f3-0c810ca21c34/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:14 crc kubenswrapper[4848]: I0217 10:07:14.762453 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-zp8bp_55a2808a-1ccd-4a2b-bf2f-25e7ea8c069f/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:14 crc kubenswrapper[4848]: I0217 10:07:14.768155 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-676bdd79dd-lq228_96fd6f0e-96ad-4a88-85ff-78f450b24279/horizon-log/0.log" Feb 17 10:07:14 crc kubenswrapper[4848]: I0217 10:07:14.938236 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-76c7ffd8bf-x42cc_fc74976d-87c5-406c-9e25-4f89c5fc2307/keystone-api/0.log" Feb 17 10:07:14 crc kubenswrapper[4848]: I0217 10:07:14.997723 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29522041-h9qqk_5bcdc424-dcde-46e9-ac56-fe1b5330844c/keystone-cron/0.log" Feb 17 10:07:15 crc kubenswrapper[4848]: I0217 10:07:15.312439 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_10c3da4a-6e24-4b18-8c11-26d2255aebcc/kube-state-metrics/0.log" Feb 17 10:07:15 crc kubenswrapper[4848]: I0217 10:07:15.361987 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-4vlqk_a694769e-5bc0-4596-945c-2de9823168f0/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:15 crc kubenswrapper[4848]: I0217 10:07:15.691573 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-685b8f6845-8tvq5_e060d08b-cb90-4fe6-badb-ae482aeb505d/neutron-api/0.log" Feb 17 10:07:15 crc kubenswrapper[4848]: I0217 10:07:15.702618 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-f467db6f-7x6cx" podUID="abcdb3d8-da38-472a-bdb3-e1615f832970" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 17 10:07:15 crc kubenswrapper[4848]: I0217 10:07:15.718652 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-685b8f6845-8tvq5_e060d08b-cb90-4fe6-badb-ae482aeb505d/neutron-httpd/0.log" Feb 17 10:07:15 crc kubenswrapper[4848]: I0217 10:07:15.795484 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-5vs9b_cb01f719-b45c-48ab-ba4a-6ffeef0d8b92/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:16 crc kubenswrapper[4848]: I0217 10:07:16.342997 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7/nova-api-log/0.log" Feb 17 10:07:16 crc kubenswrapper[4848]: I0217 10:07:16.374321 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_4a8dbe1d-cea3-4cf7-a8ef-210410453732/nova-cell0-conductor-conductor/0.log" Feb 17 10:07:16 crc kubenswrapper[4848]: I0217 10:07:16.610858 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_828cf207-286c-427a-80f4-5713b1128ecc/nova-cell1-conductor-conductor/0.log" Feb 17 10:07:16 crc kubenswrapper[4848]: I0217 10:07:16.750567 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_3fbc4701-8538-4a9d-8e88-ecf63d3e8ac7/nova-api-api/0.log" Feb 17 10:07:16 crc kubenswrapper[4848]: I0217 10:07:16.753453 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_0b2f03bc-3c70-4dc8-9478-5474155fdf90/nova-cell1-novncproxy-novncproxy/0.log" Feb 17 10:07:16 crc kubenswrapper[4848]: I0217 10:07:16.852983 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-jnqsf_8b957c61-bd51-415a-9d34-da20cb8ebd55/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 10:07:17.002018 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 10:07:17.002057 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 
10:07:17.030654 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28ef5b18-4ce7-4850-b6c8-70e0727fc805/nova-metadata-log/0.log" Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 10:07:17.066317 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 10:07:17.307126 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 10:07:17.353555 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z79v4"] Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 10:07:17.424465 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_241bdede-0e36-4cfa-965b-89449d5f84f0/mysql-bootstrap/0.log" Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 10:07:17.554058 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_627e2420-02b2-4269-adc9-573cd91cccd9/nova-scheduler-scheduler/0.log" Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 10:07:17.601548 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_241bdede-0e36-4cfa-965b-89449d5f84f0/mysql-bootstrap/0.log" Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 10:07:17.671278 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_241bdede-0e36-4cfa-965b-89449d5f84f0/galera/0.log" Feb 17 10:07:17 crc kubenswrapper[4848]: I0217 10:07:17.808737 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ac51f8f5-cf36-44ef-b849-9bd6265e5156/mysql-bootstrap/0.log" Feb 17 10:07:18 crc kubenswrapper[4848]: I0217 10:07:18.066733 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_ac51f8f5-cf36-44ef-b849-9bd6265e5156/galera/0.log" Feb 17 10:07:18 crc kubenswrapper[4848]: I0217 10:07:18.130559 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ac51f8f5-cf36-44ef-b849-9bd6265e5156/mysql-bootstrap/0.log" Feb 17 10:07:18 crc kubenswrapper[4848]: I0217 10:07:18.306466 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_5211bb87-9d50-485f-aa61-43f8d57339c7/openstackclient/0.log" Feb 17 10:07:18 crc kubenswrapper[4848]: I0217 10:07:18.406776 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_28ef5b18-4ce7-4850-b6c8-70e0727fc805/nova-metadata-metadata/0.log" Feb 17 10:07:18 crc kubenswrapper[4848]: I0217 10:07:18.571790 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-c695f_43e80552-f64e-4257-a460-f108ee513c12/ovn-controller/0.log" Feb 17 10:07:18 crc kubenswrapper[4848]: I0217 10:07:18.744715 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-28cvn_1f7ecdca-433f-4bcc-a3d5-e433a8db3bad/openstack-network-exporter/0.log" Feb 17 10:07:18 crc kubenswrapper[4848]: I0217 10:07:18.788977 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbwkv_e09d3b82-ad17-461a-89eb-b8ee45d4edff/ovsdb-server-init/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.049240 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbwkv_e09d3b82-ad17-461a-89eb-b8ee45d4edff/ovsdb-server-init/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.057833 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-jbwkv_e09d3b82-ad17-461a-89eb-b8ee45d4edff/ovsdb-server/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.083104 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-jbwkv_e09d3b82-ad17-461a-89eb-b8ee45d4edff/ovs-vswitchd/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.253520 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5263f1c0-e02a-4383-ae9f-3b223486a59e/openstack-network-exporter/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.265115 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z79v4" podUID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerName="registry-server" containerID="cri-o://0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6" gracePeriod=2 Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.343137 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-mq87l_3db20d69-cab8-4176-a71c-172899e90c3d/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.374038 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5263f1c0-e02a-4383-ae9f-3b223486a59e/ovn-northd/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.625934 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9ccc9c6f-4e19-464f-9e06-7a3951c63c85/openstack-network-exporter/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.694820 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_9ccc9c6f-4e19-464f-9e06-7a3951c63c85/ovsdbserver-nb/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.790780 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.869117 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0c03c6cc-b85f-465f-b692-8f50eaca7cd6/openstack-network-exporter/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.873753 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_0c03c6cc-b85f-465f-b692-8f50eaca7cd6/ovsdbserver-sb/0.log" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.891909 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-catalog-content\") pod \"e4db3fa5-1874-4e1f-8db3-0507e8610158\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.892028 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmd5l\" (UniqueName: \"kubernetes.io/projected/e4db3fa5-1874-4e1f-8db3-0507e8610158-kube-api-access-kmd5l\") pod \"e4db3fa5-1874-4e1f-8db3-0507e8610158\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.892265 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-utilities\") pod \"e4db3fa5-1874-4e1f-8db3-0507e8610158\" (UID: \"e4db3fa5-1874-4e1f-8db3-0507e8610158\") " Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.894039 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-utilities" (OuterVolumeSpecName: "utilities") pod "e4db3fa5-1874-4e1f-8db3-0507e8610158" (UID: "e4db3fa5-1874-4e1f-8db3-0507e8610158"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.923855 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4db3fa5-1874-4e1f-8db3-0507e8610158-kube-api-access-kmd5l" (OuterVolumeSpecName: "kube-api-access-kmd5l") pod "e4db3fa5-1874-4e1f-8db3-0507e8610158" (UID: "e4db3fa5-1874-4e1f-8db3-0507e8610158"). InnerVolumeSpecName "kube-api-access-kmd5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.940997 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4db3fa5-1874-4e1f-8db3-0507e8610158" (UID: "e4db3fa5-1874-4e1f-8db3-0507e8610158"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.994609 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.994889 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4db3fa5-1874-4e1f-8db3-0507e8610158-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 10:07:19 crc kubenswrapper[4848]: I0217 10:07:19.994989 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmd5l\" (UniqueName: \"kubernetes.io/projected/e4db3fa5-1874-4e1f-8db3-0507e8610158-kube-api-access-kmd5l\") on node \"crc\" DevicePath \"\"" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.062949 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b6cdb54-tkxbl_eba884ff-2e19-4dca-ba2e-75a8a311ea19/placement-api/0.log" Feb 
17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.210350 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7b6cdb54-tkxbl_eba884ff-2e19-4dca-ba2e-75a8a311ea19/placement-log/0.log" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.248892 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5fc69cf-e25e-4ca3-adc3-36b1678691e1/setup-container/0.log" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.278957 4848 generic.go:334] "Generic (PLEG): container finished" podID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerID="0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6" exitCode=0 Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.278997 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z79v4" event={"ID":"e4db3fa5-1874-4e1f-8db3-0507e8610158","Type":"ContainerDied","Data":"0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6"} Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.279020 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z79v4" event={"ID":"e4db3fa5-1874-4e1f-8db3-0507e8610158","Type":"ContainerDied","Data":"06dd0e5408aecdde7d6713baa1039274304aff57838b47c2fb7036d5008150aa"} Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.279036 4848 scope.go:117] "RemoveContainer" containerID="0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.279180 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z79v4" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.303200 4848 scope.go:117] "RemoveContainer" containerID="d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.323503 4848 scope.go:117] "RemoveContainer" containerID="355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.330595 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z79v4"] Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.338544 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z79v4"] Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.368778 4848 scope.go:117] "RemoveContainer" containerID="0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6" Feb 17 10:07:20 crc kubenswrapper[4848]: E0217 10:07:20.376937 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6\": container with ID starting with 0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6 not found: ID does not exist" containerID="0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.377167 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6"} err="failed to get container status \"0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6\": rpc error: code = NotFound desc = could not find container \"0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6\": container with ID starting with 0b99b8078e54663499906d14724c7f4b05d6f9dafc1bed77a1145dd8505eecc6 not found: 
ID does not exist" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.377285 4848 scope.go:117] "RemoveContainer" containerID="d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8" Feb 17 10:07:20 crc kubenswrapper[4848]: E0217 10:07:20.380483 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8\": container with ID starting with d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8 not found: ID does not exist" containerID="d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.380528 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8"} err="failed to get container status \"d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8\": rpc error: code = NotFound desc = could not find container \"d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8\": container with ID starting with d59cc8ab40d0fbc80af224b2b9bb981645926da2113ea48204fc2bc6a75f94e8 not found: ID does not exist" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.380554 4848 scope.go:117] "RemoveContainer" containerID="355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3" Feb 17 10:07:20 crc kubenswrapper[4848]: E0217 10:07:20.380830 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3\": container with ID starting with 355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3 not found: ID does not exist" containerID="355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.380847 4848 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3"} err="failed to get container status \"355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3\": rpc error: code = NotFound desc = could not find container \"355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3\": container with ID starting with 355034d7dfe95f22e80405c49b71e09f61ddd35b71ea3c4f1a694345386016d3 not found: ID does not exist" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.423292 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5fc69cf-e25e-4ca3-adc3-36b1678691e1/setup-container/0.log" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.448459 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_30e48298-cbbd-4637-83a9-733efaaf0756/setup-container/0.log" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.512832 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_b5fc69cf-e25e-4ca3-adc3-36b1678691e1/rabbitmq/0.log" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.713897 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_30e48298-cbbd-4637-83a9-733efaaf0756/rabbitmq/0.log" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.740641 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_30e48298-cbbd-4637-83a9-733efaaf0756/setup-container/0.log" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.768299 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-9vsm6_3230b202-405b-4545-b04f-8c01231f565e/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:20 crc kubenswrapper[4848]: I0217 10:07:20.951122 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-hm9jz_9762dbd7-6ed8-433c-a176-402586491e40/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:21 crc kubenswrapper[4848]: I0217 10:07:21.056706 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-7hz29_6dc6526d-a2c1-40d1-a503-71c4315cc00c/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:21 crc kubenswrapper[4848]: I0217 10:07:21.254136 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-2vff5_14190cb8-1489-4fb2-8c06-0eb40f1f584e/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:21 crc kubenswrapper[4848]: I0217 10:07:21.276622 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-6gvlr_9042f387-8534-4c6e-a64a-08154984ff7d/ssh-known-hosts-edpm-deployment/0.log" Feb 17 10:07:21 crc kubenswrapper[4848]: I0217 10:07:21.393919 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4db3fa5-1874-4e1f-8db3-0507e8610158" path="/var/lib/kubelet/pods/e4db3fa5-1874-4e1f-8db3-0507e8610158/volumes" Feb 17 10:07:21 crc kubenswrapper[4848]: I0217 10:07:21.676200 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f467db6f-7x6cx_abcdb3d8-da38-472a-bdb3-e1615f832970/proxy-server/0.log" Feb 17 10:07:21 crc kubenswrapper[4848]: I0217 10:07:21.787666 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-f467db6f-7x6cx_abcdb3d8-da38-472a-bdb3-e1615f832970/proxy-httpd/0.log" Feb 17 10:07:21 crc kubenswrapper[4848]: I0217 10:07:21.892157 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-c6lrl_6a049c1c-b425-44cc-bde0-2e83be29d1a1/swift-ring-rebalance/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.028462 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/account-reaper/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.041862 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/account-auditor/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.130094 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/account-server/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.235522 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/account-replicator/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.243907 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/container-auditor/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.291447 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/container-replicator/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.394597 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/container-server/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.437419 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/container-updater/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.491899 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/object-auditor/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.504203 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/object-expirer/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.605217 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/object-replicator/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.642774 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/object-server/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.677113 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/object-updater/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.740461 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/rsync/0.log" Feb 17 10:07:22 crc kubenswrapper[4848]: I0217 10:07:22.902038 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_5bc15802-6def-48fe-8fd5-e6d85d068827/swift-recon-cron/0.log" Feb 17 10:07:23 crc kubenswrapper[4848]: I0217 10:07:23.050593 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-x4455_78d70d99-629a-4211-9ead-66a16b766326/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:23 crc kubenswrapper[4848]: I0217 10:07:23.168633 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_c3f9f72f-9857-440a-a108-80d6b424f2a3/test-operator-logs-container/0.log" Feb 17 10:07:23 crc kubenswrapper[4848]: I0217 10:07:23.186313 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d195d00b-8819-4a35-9e3c-6b4b21660400/tempest-tests-tempest-tests-runner/0.log" Feb 17 10:07:23 crc kubenswrapper[4848]: I0217 
10:07:23.376969 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-m6tcd_31a5681d-60a0-455a-af52-e43f66fb1e93/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 17 10:07:31 crc kubenswrapper[4848]: I0217 10:07:31.760905 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_c60672b9-d590-48a6-80c0-e3f74547b5c2/memcached/0.log" Feb 17 10:07:48 crc kubenswrapper[4848]: I0217 10:07:48.225131 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/util/0.log" Feb 17 10:07:48 crc kubenswrapper[4848]: I0217 10:07:48.418997 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/util/0.log" Feb 17 10:07:48 crc kubenswrapper[4848]: I0217 10:07:48.453229 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/pull/0.log" Feb 17 10:07:48 crc kubenswrapper[4848]: I0217 10:07:48.495567 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/pull/0.log" Feb 17 10:07:48 crc kubenswrapper[4848]: I0217 10:07:48.689064 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/extract/0.log" Feb 17 10:07:48 crc kubenswrapper[4848]: I0217 10:07:48.704974 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/pull/0.log" Feb 17 10:07:48 crc kubenswrapper[4848]: I0217 10:07:48.723706 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_4bf6be8fe88744fb8c7a45482d50861896e90ebf8f05f0c089b9c27c21pdwtw_0eda6969-cb8d-4b90-84a0-606b61156a05/util/0.log" Feb 17 10:07:49 crc kubenswrapper[4848]: I0217 10:07:49.141821 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-fbwmm_05876a75-9b3e-45b7-a3fe-89ab569742fd/manager/0.log" Feb 17 10:07:49 crc kubenswrapper[4848]: I0217 10:07:49.474448 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987464f4-958lw_17a4dcbd-4735-48d6-a575-f7d3af6843f1/manager/0.log" Feb 17 10:07:49 crc kubenswrapper[4848]: I0217 10:07:49.550373 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-86z5g_32c32c38-9ebf-4e9a-bea8-e761159dda5f/manager/0.log" Feb 17 10:07:50 crc kubenswrapper[4848]: I0217 10:07:50.024266 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-5lpnr_ce2d3288-2b7d-4db8-861d-0a413fc90222/manager/0.log" Feb 17 10:07:50 crc kubenswrapper[4848]: I0217 10:07:50.145160 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-5d946d989d-wlll6_b2e407ed-c962-4fcf-b367-f4164d644de6/manager/0.log" Feb 17 10:07:50 crc kubenswrapper[4848]: I0217 10:07:50.460903 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-sw5bf_04f9fe37-de58-4b62-896e-0945a7bcbfdf/manager/0.log" Feb 17 10:07:50 crc kubenswrapper[4848]: I0217 10:07:50.620261 4848 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-dlmq4_a2de98b6-28a9-446d-bc9b-ac7aad58be7d/manager/0.log" Feb 17 10:07:50 crc kubenswrapper[4848]: I0217 10:07:50.745986 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-9wvc8_97430748-300a-434e-a6b3-52274422ab66/manager/0.log" Feb 17 10:07:50 crc kubenswrapper[4848]: I0217 10:07:50.927502 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-54f6768c69-qd7ds_e6251943-952f-4cbc-924c-b362d9f7c8da/manager/0.log" Feb 17 10:07:51 crc kubenswrapper[4848]: I0217 10:07:51.195463 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-lth6q_f151e0ea-ac05-426d-aa94-e32cc25fdc09/manager/0.log" Feb 17 10:07:51 crc kubenswrapper[4848]: I0217 10:07:51.298930 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64ddbf8bb-fkvjd_aa45cd11-5d86-47c3-b46e-15c0b204feb6/manager/0.log" Feb 17 10:07:51 crc kubenswrapper[4848]: I0217 10:07:51.565597 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-rvzql_88153939-7ca7-448d-a21c-b8330360b5a1/manager/0.log" Feb 17 10:07:51 crc kubenswrapper[4848]: I0217 10:07:51.816379 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-5f8cd6b89b8npbr_f658d1a9-916e-41c9-8268-e94c22c6a045/manager/0.log" Feb 17 10:07:52 crc kubenswrapper[4848]: I0217 10:07:52.296003 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-7f8db498b4-ps9gz_4fa72ae4-db9b-4093-92bf-7aeb4ab8b7ab/operator/0.log" Feb 17 10:07:52 crc kubenswrapper[4848]: I0217 
10:07:52.630933 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-b2s6b_7b87255a-321f-4b26-bc23-a7d5aeff53e2/registry-server/0.log" Feb 17 10:07:52 crc kubenswrapper[4848]: I0217 10:07:52.886481 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-d44cf6b75-ckggs_653da755-b43b-4da9-bfd9-e8ee0bb44cc4/manager/0.log" Feb 17 10:07:53 crc kubenswrapper[4848]: I0217 10:07:53.109977 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-ckxfq_3922bb1d-9f36-4ffc-b382-a54c1c213008/manager/0.log" Feb 17 10:07:53 crc kubenswrapper[4848]: I0217 10:07:53.261199 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-69f8888797-mcnbc_aebf2b53-aeb7-44a1-a2c2-3dfaa0e01bdd/manager/0.log" Feb 17 10:07:53 crc kubenswrapper[4848]: I0217 10:07:53.350282 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-hhb2h_a150a634-4cfd-4d77-ada7-5ab1f65a8985/operator/0.log" Feb 17 10:07:53 crc kubenswrapper[4848]: I0217 10:07:53.515392 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-mltd5_6b8d9b10-d577-4621-88d4-6f26e692a502/manager/0.log" Feb 17 10:07:53 crc kubenswrapper[4848]: I0217 10:07:53.801652 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-7f45b4ff68-jcd2x_bced4dcb-9bee-42a5-9e52-e6ddc83f8f06/manager/0.log" Feb 17 10:07:54 crc kubenswrapper[4848]: I0217 10:07:54.024792 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7866795846-4phd8_632a67ba-e5ae-43bb-a69e-49cf64c054e4/manager/0.log" Feb 17 10:07:54 crc kubenswrapper[4848]: I0217 
10:07:54.154702 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-74d597bfd6-4ptpp_32b36aa1-7151-443d-9091-bc1e8ea86805/manager/0.log" Feb 17 10:07:54 crc kubenswrapper[4848]: I0217 10:07:54.165303 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-5db88f68c-7swkp_e72f9717-510f-4f9e-8557-ccd69b4dc61c/manager/0.log" Feb 17 10:07:55 crc kubenswrapper[4848]: I0217 10:07:55.695615 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-vqb7z_cf85b89f-2556-4ee7-a12b-6a4379f962e9/manager/0.log" Feb 17 10:08:15 crc kubenswrapper[4848]: I0217 10:08:15.552320 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-4cw7d_f7b4b17a-ac26-4e3a-8f67-d0c855b6aa95/control-plane-machine-set-operator/0.log" Feb 17 10:08:15 crc kubenswrapper[4848]: I0217 10:08:15.647488 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fqhrm_1ebf9d1e-e313-440d-992a-9e0ede5b2b24/kube-rbac-proxy/0.log" Feb 17 10:08:15 crc kubenswrapper[4848]: I0217 10:08:15.723809 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-fqhrm_1ebf9d1e-e313-440d-992a-9e0ede5b2b24/machine-api-operator/0.log" Feb 17 10:08:18 crc kubenswrapper[4848]: I0217 10:08:18.771496 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 10:08:18 crc kubenswrapper[4848]: I0217 10:08:18.771820 4848 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 10:08:29 crc kubenswrapper[4848]: I0217 10:08:29.456517 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-whbhm_08f85cef-d7cc-46c2-a1ff-ba22a9b098ab/cert-manager-controller/0.log" Feb 17 10:08:29 crc kubenswrapper[4848]: I0217 10:08:29.610449 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-8pgtk_4851dc21-51d9-4c87-a15c-4b7295155016/cert-manager-cainjector/0.log" Feb 17 10:08:29 crc kubenswrapper[4848]: I0217 10:08:29.648628 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-crrm4_f0048a92-b3fb-4c29-b58f-7013b68e1512/cert-manager-webhook/0.log" Feb 17 10:08:42 crc kubenswrapper[4848]: I0217 10:08:42.474856 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5c78fc5d65-x4n7r_413a0360-d8d4-427d-adbc-3d7914e54ea5/nmstate-console-plugin/0.log" Feb 17 10:08:42 crc kubenswrapper[4848]: I0217 10:08:42.604873 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-6fjkq_473852b6-e35d-4b9f-8b47-e55ccb774b93/nmstate-handler/0.log" Feb 17 10:08:42 crc kubenswrapper[4848]: I0217 10:08:42.650064 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-7txp2_989f6a1e-38ab-40a8-94aa-faadc620efca/kube-rbac-proxy/0.log" Feb 17 10:08:42 crc kubenswrapper[4848]: I0217 10:08:42.797742 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58c85c668d-7txp2_989f6a1e-38ab-40a8-94aa-faadc620efca/nmstate-metrics/0.log" Feb 17 10:08:42 crc kubenswrapper[4848]: 
I0217 10:08:42.822056 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-694c9596b7-swpl6_c406a2fd-a4a9-47fb-bfff-80324dae94c4/nmstate-operator/0.log" Feb 17 10:08:43 crc kubenswrapper[4848]: I0217 10:08:43.000063 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-866bcb46dc-pl898_08ae32ff-43fc-4536-b68a-45e4fd947a2d/nmstate-webhook/0.log" Feb 17 10:08:48 crc kubenswrapper[4848]: I0217 10:08:48.772244 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 10:08:48 crc kubenswrapper[4848]: I0217 10:08:48.772896 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 10:09:10 crc kubenswrapper[4848]: I0217 10:09:10.661411 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7cbd2_8ff272d7-4c99-464c-819d-b7b22fc8be06/kube-rbac-proxy/0.log" Feb 17 10:09:10 crc kubenswrapper[4848]: I0217 10:09:10.769399 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-69bbfbf88f-7cbd2_8ff272d7-4c99-464c-819d-b7b22fc8be06/controller/0.log" Feb 17 10:09:10 crc kubenswrapper[4848]: I0217 10:09:10.861950 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-frr-files/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.053003 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-reloader/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.067656 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-frr-files/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.095955 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-reloader/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.096344 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-metrics/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.279824 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-frr-files/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.286502 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-reloader/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.301715 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-metrics/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.340465 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-metrics/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.510610 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-metrics/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.510637 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-frr-files/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.511074 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/cp-reloader/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.527227 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/controller/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.704708 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/frr-metrics/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.723492 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/kube-rbac-proxy/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.750290 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/kube-rbac-proxy-frr/0.log" Feb 17 10:09:11 crc kubenswrapper[4848]: I0217 10:09:11.914170 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/reloader/0.log" Feb 17 10:09:12 crc kubenswrapper[4848]: I0217 10:09:12.030415 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-78b44bf5bb-5l7zm_c229235f-b879-43bc-9b19-b4196264d1ec/frr-k8s-webhook-server/0.log" Feb 17 10:09:12 crc kubenswrapper[4848]: I0217 10:09:12.183216 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6df4786bd-895gn_054a38ba-b80d-44df-b84a-e5e3b9847df3/manager/0.log" Feb 17 10:09:12 crc kubenswrapper[4848]: I0217 10:09:12.417595 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7f987958c8-pm7pp_ccc28183-efb5-4673-8268-44ed1ced4cb7/webhook-server/0.log" Feb 17 10:09:12 crc kubenswrapper[4848]: I0217 10:09:12.435234 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lmz9q_36953889-0f59-4c5e-a666-c80389e18bf8/kube-rbac-proxy/0.log" Feb 17 10:09:13 crc kubenswrapper[4848]: I0217 10:09:13.051356 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-lmz9q_36953889-0f59-4c5e-a666-c80389e18bf8/speaker/0.log" Feb 17 10:09:13 crc kubenswrapper[4848]: I0217 10:09:13.055023 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-hrx8s_fe3f8c76-b77b-410c-830a-24fb19a0de6a/frr/0.log" Feb 17 10:09:18 crc kubenswrapper[4848]: I0217 10:09:18.771715 4848 patch_prober.go:28] interesting pod/machine-config-daemon-stvnz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 17 10:09:18 crc kubenswrapper[4848]: I0217 10:09:18.772368 4848 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 17 10:09:18 crc kubenswrapper[4848]: I0217 10:09:18.772423 4848 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" Feb 17 10:09:18 crc kubenswrapper[4848]: I0217 10:09:18.773186 4848 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab"} 
pod="openshift-machine-config-operator/machine-config-daemon-stvnz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 17 10:09:18 crc kubenswrapper[4848]: I0217 10:09:18.773274 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerName="machine-config-daemon" containerID="cri-o://c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" gracePeriod=600 Feb 17 10:09:19 crc kubenswrapper[4848]: E0217 10:09:19.404398 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:09:19 crc kubenswrapper[4848]: I0217 10:09:19.548108 4848 generic.go:334] "Generic (PLEG): container finished" podID="7c28fed4-873d-42f6-ae63-03d12a425d0a" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" exitCode=0 Feb 17 10:09:19 crc kubenswrapper[4848]: I0217 10:09:19.548133 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerDied","Data":"c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab"} Feb 17 10:09:19 crc kubenswrapper[4848]: I0217 10:09:19.548193 4848 scope.go:117] "RemoveContainer" containerID="bf1fbd7c833a9345e13be53113b338550eebd27fd5bf7c6bcfec90cfb0ca3555" Feb 17 10:09:19 crc kubenswrapper[4848]: I0217 10:09:19.548807 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 
17 10:09:19 crc kubenswrapper[4848]: E0217 10:09:19.549118 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.105978 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/util/0.log" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.351673 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/pull/0.log" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.355923 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/util/0.log" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.368978 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/pull/0.log" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.516068 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/util/0.log" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.517189 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/pull/0.log" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.584823 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a9b3ed1fe9273b725119dcfb777257f08e39bbefccdf592dce2d0dc213q8tsb_24a91401-07a6-41bf-ad5b-fa2b8f60a52f/extract/0.log" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.722543 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-utilities/0.log" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.900574 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-utilities/0.log" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.904234 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-content/0.log" Feb 17 10:09:27 crc kubenswrapper[4848]: I0217 10:09:27.908879 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-content/0.log" Feb 17 10:09:28 crc kubenswrapper[4848]: I0217 10:09:28.087591 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-utilities/0.log" Feb 17 10:09:28 crc kubenswrapper[4848]: I0217 10:09:28.133800 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/extract-content/0.log" Feb 17 10:09:28 crc kubenswrapper[4848]: I0217 10:09:28.345537 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-utilities/0.log" Feb 17 10:09:28 crc kubenswrapper[4848]: I0217 10:09:28.550013 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-utilities/0.log" Feb 17 10:09:28 crc kubenswrapper[4848]: I0217 10:09:28.556667 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-thxdc_42e5e859-f703-486e-9d9b-06660c9d3e51/registry-server/0.log" Feb 17 10:09:28 crc kubenswrapper[4848]: I0217 10:09:28.609572 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-content/0.log" Feb 17 10:09:28 crc kubenswrapper[4848]: I0217 10:09:28.633087 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-content/0.log" Feb 17 10:09:28 crc kubenswrapper[4848]: I0217 10:09:28.703888 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-utilities/0.log" Feb 17 10:09:28 crc kubenswrapper[4848]: I0217 10:09:28.752230 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/extract-content/0.log" Feb 17 10:09:28 crc kubenswrapper[4848]: I0217 10:09:28.885743 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/util/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.162204 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/util/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.195984 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/pull/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.234979 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/pull/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.288771 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-d4ccz_f36cf8f9-f4fd-456f-80ce-c4d68f71273b/registry-server/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.372932 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/util/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.375065 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/pull/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.439960 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f938df2ce267491f058ea7e3036e97ee3f65bf3665185b1a4f52323ecavkt28_8eb1e57e-4c70-41bf-a650-989b432ce3b6/extract/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.564728 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-xvh8l_cd7df0ba-ff8c-48ce-ad07-8ac50003f318/marketplace-operator/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: 
I0217 10:09:29.652873 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-utilities/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.800087 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-content/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.821019 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-utilities/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.831133 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-content/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.984460 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-utilities/0.log" Feb 17 10:09:29 crc kubenswrapper[4848]: I0217 10:09:29.993776 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/extract-content/0.log" Feb 17 10:09:30 crc kubenswrapper[4848]: I0217 10:09:30.153232 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-vnl45_dc54e6a7-6fc7-4ca3-8e51-d17a8cf24a01/registry-server/0.log" Feb 17 10:09:30 crc kubenswrapper[4848]: I0217 10:09:30.210910 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-utilities/0.log" Feb 17 10:09:30 crc kubenswrapper[4848]: I0217 10:09:30.323202 4848 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-content/0.log" Feb 17 10:09:30 crc kubenswrapper[4848]: I0217 10:09:30.341579 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-content/0.log" Feb 17 10:09:30 crc kubenswrapper[4848]: I0217 10:09:30.365697 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-utilities/0.log" Feb 17 10:09:30 crc kubenswrapper[4848]: I0217 10:09:30.520310 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-content/0.log" Feb 17 10:09:30 crc kubenswrapper[4848]: I0217 10:09:30.550250 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/extract-utilities/0.log" Feb 17 10:09:31 crc kubenswrapper[4848]: I0217 10:09:31.066087 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-m54wv_31b2644d-739b-4457-a3cc-c30d6b116423/registry-server/0.log" Feb 17 10:09:31 crc kubenswrapper[4848]: I0217 10:09:31.384173 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:09:31 crc kubenswrapper[4848]: E0217 10:09:31.384452 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:09:42 crc 
kubenswrapper[4848]: I0217 10:09:42.383775 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:09:42 crc kubenswrapper[4848]: E0217 10:09:42.384749 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:09:48 crc kubenswrapper[4848]: E0217 10:09:48.362530 4848 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.69:56898->38.102.83.69:38227: write tcp 38.102.83.69:56898->38.102.83.69:38227: write: broken pipe Feb 17 10:09:55 crc kubenswrapper[4848]: I0217 10:09:55.383663 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:09:55 crc kubenswrapper[4848]: E0217 10:09:55.384541 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:10:06 crc kubenswrapper[4848]: I0217 10:10:06.383374 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:10:06 crc kubenswrapper[4848]: E0217 10:10:06.386076 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:10:12 crc kubenswrapper[4848]: I0217 10:10:12.919248 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f5dw9"] Feb 17 10:10:12 crc kubenswrapper[4848]: E0217 10:10:12.920555 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerName="registry-server" Feb 17 10:10:12 crc kubenswrapper[4848]: I0217 10:10:12.920574 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerName="registry-server" Feb 17 10:10:12 crc kubenswrapper[4848]: E0217 10:10:12.920605 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerName="extract-utilities" Feb 17 10:10:12 crc kubenswrapper[4848]: I0217 10:10:12.920624 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerName="extract-utilities" Feb 17 10:10:12 crc kubenswrapper[4848]: E0217 10:10:12.920644 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerName="extract-content" Feb 17 10:10:12 crc kubenswrapper[4848]: I0217 10:10:12.920654 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerName="extract-content" Feb 17 10:10:12 crc kubenswrapper[4848]: I0217 10:10:12.921199 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4db3fa5-1874-4e1f-8db3-0507e8610158" containerName="registry-server" Feb 17 10:10:12 crc kubenswrapper[4848]: I0217 10:10:12.922637 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:12 crc kubenswrapper[4848]: I0217 10:10:12.940776 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f5dw9"] Feb 17 10:10:12 crc kubenswrapper[4848]: I0217 10:10:12.949927 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-utilities\") pod \"redhat-operators-f5dw9\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:12 crc kubenswrapper[4848]: I0217 10:10:12.950239 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqf2p\" (UniqueName: \"kubernetes.io/projected/7aa7423b-4eda-491c-9668-6d28f73c1d86-kube-api-access-kqf2p\") pod \"redhat-operators-f5dw9\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:12 crc kubenswrapper[4848]: I0217 10:10:12.950694 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-catalog-content\") pod \"redhat-operators-f5dw9\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:13 crc kubenswrapper[4848]: I0217 10:10:13.053108 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-catalog-content\") pod \"redhat-operators-f5dw9\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:13 crc kubenswrapper[4848]: I0217 10:10:13.053182 4848 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-utilities\") pod \"redhat-operators-f5dw9\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:13 crc kubenswrapper[4848]: I0217 10:10:13.053283 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqf2p\" (UniqueName: \"kubernetes.io/projected/7aa7423b-4eda-491c-9668-6d28f73c1d86-kube-api-access-kqf2p\") pod \"redhat-operators-f5dw9\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:13 crc kubenswrapper[4848]: I0217 10:10:13.053621 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-catalog-content\") pod \"redhat-operators-f5dw9\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:13 crc kubenswrapper[4848]: I0217 10:10:13.053739 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-utilities\") pod \"redhat-operators-f5dw9\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:13 crc kubenswrapper[4848]: I0217 10:10:13.081850 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqf2p\" (UniqueName: \"kubernetes.io/projected/7aa7423b-4eda-491c-9668-6d28f73c1d86-kube-api-access-kqf2p\") pod \"redhat-operators-f5dw9\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:13 crc kubenswrapper[4848]: I0217 10:10:13.263124 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:13 crc kubenswrapper[4848]: I0217 10:10:13.732061 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f5dw9"] Feb 17 10:10:14 crc kubenswrapper[4848]: I0217 10:10:14.049679 4848 generic.go:334] "Generic (PLEG): container finished" podID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerID="91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a" exitCode=0 Feb 17 10:10:14 crc kubenswrapper[4848]: I0217 10:10:14.049728 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5dw9" event={"ID":"7aa7423b-4eda-491c-9668-6d28f73c1d86","Type":"ContainerDied","Data":"91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a"} Feb 17 10:10:14 crc kubenswrapper[4848]: I0217 10:10:14.049791 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5dw9" event={"ID":"7aa7423b-4eda-491c-9668-6d28f73c1d86","Type":"ContainerStarted","Data":"118ee6ca881b9573b2e78944bbcce9bff698c4bef5de4e69f0a2bc78b757c125"} Feb 17 10:10:14 crc kubenswrapper[4848]: I0217 10:10:14.051446 4848 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 17 10:10:16 crc kubenswrapper[4848]: I0217 10:10:16.070881 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5dw9" event={"ID":"7aa7423b-4eda-491c-9668-6d28f73c1d86","Type":"ContainerStarted","Data":"ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6"} Feb 17 10:10:19 crc kubenswrapper[4848]: I0217 10:10:19.105226 4848 generic.go:334] "Generic (PLEG): container finished" podID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerID="ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6" exitCode=0 Feb 17 10:10:19 crc kubenswrapper[4848]: I0217 10:10:19.105301 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-f5dw9" event={"ID":"7aa7423b-4eda-491c-9668-6d28f73c1d86","Type":"ContainerDied","Data":"ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6"} Feb 17 10:10:20 crc kubenswrapper[4848]: I0217 10:10:20.128459 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5dw9" event={"ID":"7aa7423b-4eda-491c-9668-6d28f73c1d86","Type":"ContainerStarted","Data":"c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa"} Feb 17 10:10:20 crc kubenswrapper[4848]: I0217 10:10:20.159204 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f5dw9" podStartSLOduration=2.685896866 podStartE2EDuration="8.159179476s" podCreationTimestamp="2026-02-17 10:10:12 +0000 UTC" firstStartedPulling="2026-02-17 10:10:14.051230169 +0000 UTC m=+3891.594485815" lastFinishedPulling="2026-02-17 10:10:19.524512779 +0000 UTC m=+3897.067768425" observedRunningTime="2026-02-17 10:10:20.156254303 +0000 UTC m=+3897.699509979" watchObservedRunningTime="2026-02-17 10:10:20.159179476 +0000 UTC m=+3897.702435132" Feb 17 10:10:20 crc kubenswrapper[4848]: I0217 10:10:20.384542 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:10:20 crc kubenswrapper[4848]: E0217 10:10:20.384911 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:10:23 crc kubenswrapper[4848]: I0217 10:10:23.263410 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:23 crc kubenswrapper[4848]: I0217 10:10:23.266335 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:24 crc kubenswrapper[4848]: I0217 10:10:24.335851 4848 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f5dw9" podUID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerName="registry-server" probeResult="failure" output=< Feb 17 10:10:24 crc kubenswrapper[4848]: timeout: failed to connect service ":50051" within 1s Feb 17 10:10:24 crc kubenswrapper[4848]: > Feb 17 10:10:33 crc kubenswrapper[4848]: I0217 10:10:33.325832 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:33 crc kubenswrapper[4848]: I0217 10:10:33.399068 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:33 crc kubenswrapper[4848]: I0217 10:10:33.575889 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f5dw9"] Feb 17 10:10:34 crc kubenswrapper[4848]: I0217 10:10:34.383297 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:10:34 crc kubenswrapper[4848]: E0217 10:10:34.383943 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:10:35 crc kubenswrapper[4848]: I0217 10:10:35.267707 4848 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/redhat-operators-f5dw9" podUID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerName="registry-server" containerID="cri-o://c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa" gracePeriod=2 Feb 17 10:10:35 crc kubenswrapper[4848]: I0217 10:10:35.774043 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:35 crc kubenswrapper[4848]: I0217 10:10:35.873835 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqf2p\" (UniqueName: \"kubernetes.io/projected/7aa7423b-4eda-491c-9668-6d28f73c1d86-kube-api-access-kqf2p\") pod \"7aa7423b-4eda-491c-9668-6d28f73c1d86\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " Feb 17 10:10:35 crc kubenswrapper[4848]: I0217 10:10:35.873888 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-utilities\") pod \"7aa7423b-4eda-491c-9668-6d28f73c1d86\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " Feb 17 10:10:35 crc kubenswrapper[4848]: I0217 10:10:35.873927 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-catalog-content\") pod \"7aa7423b-4eda-491c-9668-6d28f73c1d86\" (UID: \"7aa7423b-4eda-491c-9668-6d28f73c1d86\") " Feb 17 10:10:35 crc kubenswrapper[4848]: I0217 10:10:35.883863 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-utilities" (OuterVolumeSpecName: "utilities") pod "7aa7423b-4eda-491c-9668-6d28f73c1d86" (UID: "7aa7423b-4eda-491c-9668-6d28f73c1d86"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:10:35 crc kubenswrapper[4848]: I0217 10:10:35.890346 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aa7423b-4eda-491c-9668-6d28f73c1d86-kube-api-access-kqf2p" (OuterVolumeSpecName: "kube-api-access-kqf2p") pod "7aa7423b-4eda-491c-9668-6d28f73c1d86" (UID: "7aa7423b-4eda-491c-9668-6d28f73c1d86"). InnerVolumeSpecName "kube-api-access-kqf2p". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 10:10:35 crc kubenswrapper[4848]: I0217 10:10:35.976079 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqf2p\" (UniqueName: \"kubernetes.io/projected/7aa7423b-4eda-491c-9668-6d28f73c1d86-kube-api-access-kqf2p\") on node \"crc\" DevicePath \"\"" Feb 17 10:10:35 crc kubenswrapper[4848]: I0217 10:10:35.976121 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.012380 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7aa7423b-4eda-491c-9668-6d28f73c1d86" (UID: "7aa7423b-4eda-491c-9668-6d28f73c1d86"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.078040 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7aa7423b-4eda-491c-9668-6d28f73c1d86-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.281110 4848 generic.go:334] "Generic (PLEG): container finished" podID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerID="c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa" exitCode=0 Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.281169 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5dw9" event={"ID":"7aa7423b-4eda-491c-9668-6d28f73c1d86","Type":"ContainerDied","Data":"c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa"} Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.281202 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f5dw9" event={"ID":"7aa7423b-4eda-491c-9668-6d28f73c1d86","Type":"ContainerDied","Data":"118ee6ca881b9573b2e78944bbcce9bff698c4bef5de4e69f0a2bc78b757c125"} Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.281230 4848 scope.go:117] "RemoveContainer" containerID="c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.281392 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f5dw9" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.325512 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f5dw9"] Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.332724 4848 scope.go:117] "RemoveContainer" containerID="ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.334342 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f5dw9"] Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.364434 4848 scope.go:117] "RemoveContainer" containerID="91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.420078 4848 scope.go:117] "RemoveContainer" containerID="c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa" Feb 17 10:10:36 crc kubenswrapper[4848]: E0217 10:10:36.420872 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa\": container with ID starting with c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa not found: ID does not exist" containerID="c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.420910 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa"} err="failed to get container status \"c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa\": rpc error: code = NotFound desc = could not find container \"c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa\": container with ID starting with c3c51a638051be549dcb34eb426fc8ce74b76c1219d936a3cc311e6f7481dafa not found: ID does 
not exist" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.420929 4848 scope.go:117] "RemoveContainer" containerID="ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6" Feb 17 10:10:36 crc kubenswrapper[4848]: E0217 10:10:36.421231 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6\": container with ID starting with ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6 not found: ID does not exist" containerID="ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.421299 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6"} err="failed to get container status \"ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6\": rpc error: code = NotFound desc = could not find container \"ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6\": container with ID starting with ec2089b5c7c277c6073c183dd239539db1485d410b2538b732cb44360ce65bb6 not found: ID does not exist" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.421319 4848 scope.go:117] "RemoveContainer" containerID="91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a" Feb 17 10:10:36 crc kubenswrapper[4848]: E0217 10:10:36.421592 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a\": container with ID starting with 91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a not found: ID does not exist" containerID="91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a" Feb 17 10:10:36 crc kubenswrapper[4848]: I0217 10:10:36.421624 4848 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a"} err="failed to get container status \"91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a\": rpc error: code = NotFound desc = could not find container \"91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a\": container with ID starting with 91eab03f35fb216fd7b2c0b523828b7d26824661151abfc6cad26addd782300a not found: ID does not exist" Feb 17 10:10:37 crc kubenswrapper[4848]: I0217 10:10:37.405321 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aa7423b-4eda-491c-9668-6d28f73c1d86" path="/var/lib/kubelet/pods/7aa7423b-4eda-491c-9668-6d28f73c1d86/volumes" Feb 17 10:10:47 crc kubenswrapper[4848]: I0217 10:10:47.383691 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:10:47 crc kubenswrapper[4848]: E0217 10:10:47.384777 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:10:58 crc kubenswrapper[4848]: I0217 10:10:58.383165 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:10:58 crc kubenswrapper[4848]: E0217 10:10:58.384418 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:11:10 crc kubenswrapper[4848]: I0217 10:11:10.383927 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:11:10 crc kubenswrapper[4848]: E0217 10:11:10.384892 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:11:18 crc kubenswrapper[4848]: I0217 10:11:18.148917 4848 generic.go:334] "Generic (PLEG): container finished" podID="f850b836-0ebb-4173-aa4a-10deba9cfc12" containerID="ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a" exitCode=0 Feb 17 10:11:18 crc kubenswrapper[4848]: I0217 10:11:18.149006 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6g2kf/must-gather-86tjn" event={"ID":"f850b836-0ebb-4173-aa4a-10deba9cfc12","Type":"ContainerDied","Data":"ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a"} Feb 17 10:11:18 crc kubenswrapper[4848]: I0217 10:11:18.150159 4848 scope.go:117] "RemoveContainer" containerID="ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a" Feb 17 10:11:18 crc kubenswrapper[4848]: I0217 10:11:18.248912 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6g2kf_must-gather-86tjn_f850b836-0ebb-4173-aa4a-10deba9cfc12/gather/0.log" Feb 17 10:11:23 crc kubenswrapper[4848]: E0217 10:11:23.259524 4848 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.69:55016->38.102.83.69:38227: write tcp 38.102.83.69:55016->38.102.83.69:38227: 
write: broken pipe Feb 17 10:11:23 crc kubenswrapper[4848]: I0217 10:11:23.389009 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:11:23 crc kubenswrapper[4848]: E0217 10:11:23.389645 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:11:29 crc kubenswrapper[4848]: I0217 10:11:29.327229 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-6g2kf/must-gather-86tjn"] Feb 17 10:11:29 crc kubenswrapper[4848]: I0217 10:11:29.328126 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-6g2kf/must-gather-86tjn" podUID="f850b836-0ebb-4173-aa4a-10deba9cfc12" containerName="copy" containerID="cri-o://437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9" gracePeriod=2 Feb 17 10:11:29 crc kubenswrapper[4848]: I0217 10:11:29.336587 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-6g2kf/must-gather-86tjn"] Feb 17 10:11:29 crc kubenswrapper[4848]: I0217 10:11:29.805069 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6g2kf_must-gather-86tjn_f850b836-0ebb-4173-aa4a-10deba9cfc12/copy/0.log" Feb 17 10:11:29 crc kubenswrapper[4848]: I0217 10:11:29.805854 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6g2kf/must-gather-86tjn" Feb 17 10:11:29 crc kubenswrapper[4848]: I0217 10:11:29.874501 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fsmj\" (UniqueName: \"kubernetes.io/projected/f850b836-0ebb-4173-aa4a-10deba9cfc12-kube-api-access-9fsmj\") pod \"f850b836-0ebb-4173-aa4a-10deba9cfc12\" (UID: \"f850b836-0ebb-4173-aa4a-10deba9cfc12\") " Feb 17 10:11:29 crc kubenswrapper[4848]: I0217 10:11:29.874615 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f850b836-0ebb-4173-aa4a-10deba9cfc12-must-gather-output\") pod \"f850b836-0ebb-4173-aa4a-10deba9cfc12\" (UID: \"f850b836-0ebb-4173-aa4a-10deba9cfc12\") " Feb 17 10:11:29 crc kubenswrapper[4848]: I0217 10:11:29.885157 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f850b836-0ebb-4173-aa4a-10deba9cfc12-kube-api-access-9fsmj" (OuterVolumeSpecName: "kube-api-access-9fsmj") pod "f850b836-0ebb-4173-aa4a-10deba9cfc12" (UID: "f850b836-0ebb-4173-aa4a-10deba9cfc12"). InnerVolumeSpecName "kube-api-access-9fsmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 10:11:29 crc kubenswrapper[4848]: I0217 10:11:29.977951 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fsmj\" (UniqueName: \"kubernetes.io/projected/f850b836-0ebb-4173-aa4a-10deba9cfc12-kube-api-access-9fsmj\") on node \"crc\" DevicePath \"\"" Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.015697 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f850b836-0ebb-4173-aa4a-10deba9cfc12-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f850b836-0ebb-4173-aa4a-10deba9cfc12" (UID: "f850b836-0ebb-4173-aa4a-10deba9cfc12"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.080793 4848 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f850b836-0ebb-4173-aa4a-10deba9cfc12-must-gather-output\") on node \"crc\" DevicePath \"\"" Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.261979 4848 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-6g2kf_must-gather-86tjn_f850b836-0ebb-4173-aa4a-10deba9cfc12/copy/0.log" Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.262704 4848 generic.go:334] "Generic (PLEG): container finished" podID="f850b836-0ebb-4173-aa4a-10deba9cfc12" containerID="437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9" exitCode=143 Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.262786 4848 scope.go:117] "RemoveContainer" containerID="437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9" Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.262790 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6g2kf/must-gather-86tjn" Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.301300 4848 scope.go:117] "RemoveContainer" containerID="ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a" Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.379358 4848 scope.go:117] "RemoveContainer" containerID="437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9" Feb 17 10:11:30 crc kubenswrapper[4848]: E0217 10:11:30.379849 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9\": container with ID starting with 437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9 not found: ID does not exist" containerID="437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9" Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.379896 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9"} err="failed to get container status \"437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9\": rpc error: code = NotFound desc = could not find container \"437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9\": container with ID starting with 437b6dbd689434194efe853ed6acba48fe7ac099da10e20493d8ca14168461b9 not found: ID does not exist" Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.379918 4848 scope.go:117] "RemoveContainer" containerID="ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a" Feb 17 10:11:30 crc kubenswrapper[4848]: E0217 10:11:30.380201 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a\": container with ID starting with 
ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a not found: ID does not exist" containerID="ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a" Feb 17 10:11:30 crc kubenswrapper[4848]: I0217 10:11:30.380217 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a"} err="failed to get container status \"ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a\": rpc error: code = NotFound desc = could not find container \"ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a\": container with ID starting with ba1d7be885666b6e845e8678d0b1bbd3ca20044806e92cc4c83fa7af7e26a04a not found: ID does not exist" Feb 17 10:11:31 crc kubenswrapper[4848]: I0217 10:11:31.395897 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f850b836-0ebb-4173-aa4a-10deba9cfc12" path="/var/lib/kubelet/pods/f850b836-0ebb-4173-aa4a-10deba9cfc12/volumes" Feb 17 10:11:37 crc kubenswrapper[4848]: I0217 10:11:37.384080 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:11:37 crc kubenswrapper[4848]: E0217 10:11:37.385199 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:11:49 crc kubenswrapper[4848]: I0217 10:11:49.384337 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:11:49 crc kubenswrapper[4848]: E0217 10:11:49.385637 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:12:03 crc kubenswrapper[4848]: I0217 10:12:03.395306 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:12:03 crc kubenswrapper[4848]: E0217 10:12:03.396699 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:12:17 crc kubenswrapper[4848]: I0217 10:12:17.383567 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:12:17 crc kubenswrapper[4848]: E0217 10:12:17.384447 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:12:28 crc kubenswrapper[4848]: I0217 10:12:28.383064 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:12:28 crc kubenswrapper[4848]: E0217 10:12:28.383871 4848 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:12:43 crc kubenswrapper[4848]: I0217 10:12:43.389071 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:12:43 crc kubenswrapper[4848]: E0217 10:12:43.389906 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.751754 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vhvgf"] Feb 17 10:12:47 crc kubenswrapper[4848]: E0217 10:12:47.752649 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f850b836-0ebb-4173-aa4a-10deba9cfc12" containerName="copy" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.752663 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f850b836-0ebb-4173-aa4a-10deba9cfc12" containerName="copy" Feb 17 10:12:47 crc kubenswrapper[4848]: E0217 10:12:47.752688 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerName="extract-content" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.752696 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerName="extract-content" 
Feb 17 10:12:47 crc kubenswrapper[4848]: E0217 10:12:47.752717 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerName="extract-utilities" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.752725 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerName="extract-utilities" Feb 17 10:12:47 crc kubenswrapper[4848]: E0217 10:12:47.752750 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f850b836-0ebb-4173-aa4a-10deba9cfc12" containerName="gather" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.752775 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="f850b836-0ebb-4173-aa4a-10deba9cfc12" containerName="gather" Feb 17 10:12:47 crc kubenswrapper[4848]: E0217 10:12:47.752789 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerName="registry-server" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.752809 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerName="registry-server" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.753677 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f850b836-0ebb-4173-aa4a-10deba9cfc12" containerName="copy" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.753731 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="f850b836-0ebb-4173-aa4a-10deba9cfc12" containerName="gather" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.753744 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aa7423b-4eda-491c-9668-6d28f73c1d86" containerName="registry-server" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.756146 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.806821 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vhvgf"] Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.822080 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-catalog-content\") pod \"community-operators-vhvgf\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.822488 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnprf\" (UniqueName: \"kubernetes.io/projected/6f633976-d8e1-4f15-8c2f-5b08f3391017-kube-api-access-bnprf\") pod \"community-operators-vhvgf\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.822879 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-utilities\") pod \"community-operators-vhvgf\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.925022 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-catalog-content\") pod \"community-operators-vhvgf\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.925104 4848 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bnprf\" (UniqueName: \"kubernetes.io/projected/6f633976-d8e1-4f15-8c2f-5b08f3391017-kube-api-access-bnprf\") pod \"community-operators-vhvgf\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.925247 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-utilities\") pod \"community-operators-vhvgf\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.925501 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-catalog-content\") pod \"community-operators-vhvgf\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.925814 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-utilities\") pod \"community-operators-vhvgf\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:47 crc kubenswrapper[4848]: I0217 10:12:47.947215 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnprf\" (UniqueName: \"kubernetes.io/projected/6f633976-d8e1-4f15-8c2f-5b08f3391017-kube-api-access-bnprf\") pod \"community-operators-vhvgf\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:48 crc kubenswrapper[4848]: I0217 10:12:48.102537 4848 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:48 crc kubenswrapper[4848]: I0217 10:12:48.632657 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vhvgf"] Feb 17 10:12:49 crc kubenswrapper[4848]: I0217 10:12:49.048056 4848 generic.go:334] "Generic (PLEG): container finished" podID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerID="cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd" exitCode=0 Feb 17 10:12:49 crc kubenswrapper[4848]: I0217 10:12:49.048111 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhvgf" event={"ID":"6f633976-d8e1-4f15-8c2f-5b08f3391017","Type":"ContainerDied","Data":"cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd"} Feb 17 10:12:49 crc kubenswrapper[4848]: I0217 10:12:49.048140 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhvgf" event={"ID":"6f633976-d8e1-4f15-8c2f-5b08f3391017","Type":"ContainerStarted","Data":"148b3cccb8db6234b87618f19497bcdd3f2614da3b245e55f59fb8033fb8fd14"} Feb 17 10:12:50 crc kubenswrapper[4848]: I0217 10:12:50.061228 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhvgf" event={"ID":"6f633976-d8e1-4f15-8c2f-5b08f3391017","Type":"ContainerStarted","Data":"02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd"} Feb 17 10:12:51 crc kubenswrapper[4848]: I0217 10:12:51.072634 4848 generic.go:334] "Generic (PLEG): container finished" podID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerID="02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd" exitCode=0 Feb 17 10:12:51 crc kubenswrapper[4848]: I0217 10:12:51.072742 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhvgf" 
event={"ID":"6f633976-d8e1-4f15-8c2f-5b08f3391017","Type":"ContainerDied","Data":"02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd"} Feb 17 10:12:52 crc kubenswrapper[4848]: I0217 10:12:52.086310 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhvgf" event={"ID":"6f633976-d8e1-4f15-8c2f-5b08f3391017","Type":"ContainerStarted","Data":"145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c"} Feb 17 10:12:52 crc kubenswrapper[4848]: I0217 10:12:52.107570 4848 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vhvgf" podStartSLOduration=2.634741432 podStartE2EDuration="5.107552439s" podCreationTimestamp="2026-02-17 10:12:47 +0000 UTC" firstStartedPulling="2026-02-17 10:12:49.050004918 +0000 UTC m=+4046.593260564" lastFinishedPulling="2026-02-17 10:12:51.522815925 +0000 UTC m=+4049.066071571" observedRunningTime="2026-02-17 10:12:52.102527127 +0000 UTC m=+4049.645782773" watchObservedRunningTime="2026-02-17 10:12:52.107552439 +0000 UTC m=+4049.650808085" Feb 17 10:12:57 crc kubenswrapper[4848]: I0217 10:12:57.384160 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab" Feb 17 10:12:57 crc kubenswrapper[4848]: E0217 10:12:57.385134 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a" Feb 17 10:12:58 crc kubenswrapper[4848]: I0217 10:12:58.102849 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:58 crc 
kubenswrapper[4848]: I0217 10:12:58.103233 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:58 crc kubenswrapper[4848]: I0217 10:12:58.171449 4848 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:58 crc kubenswrapper[4848]: I0217 10:12:58.230944 4848 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:12:58 crc kubenswrapper[4848]: I0217 10:12:58.411054 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vhvgf"] Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.166151 4848 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vhvgf" podUID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerName="registry-server" containerID="cri-o://145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c" gracePeriod=2 Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.651280 4848 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.669703 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-catalog-content\") pod \"6f633976-d8e1-4f15-8c2f-5b08f3391017\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.669802 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnprf\" (UniqueName: \"kubernetes.io/projected/6f633976-d8e1-4f15-8c2f-5b08f3391017-kube-api-access-bnprf\") pod \"6f633976-d8e1-4f15-8c2f-5b08f3391017\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.669913 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-utilities\") pod \"6f633976-d8e1-4f15-8c2f-5b08f3391017\" (UID: \"6f633976-d8e1-4f15-8c2f-5b08f3391017\") " Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.671538 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-utilities" (OuterVolumeSpecName: "utilities") pod "6f633976-d8e1-4f15-8c2f-5b08f3391017" (UID: "6f633976-d8e1-4f15-8c2f-5b08f3391017"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.677086 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f633976-d8e1-4f15-8c2f-5b08f3391017-kube-api-access-bnprf" (OuterVolumeSpecName: "kube-api-access-bnprf") pod "6f633976-d8e1-4f15-8c2f-5b08f3391017" (UID: "6f633976-d8e1-4f15-8c2f-5b08f3391017"). InnerVolumeSpecName "kube-api-access-bnprf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.761612 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f633976-d8e1-4f15-8c2f-5b08f3391017" (UID: "6f633976-d8e1-4f15-8c2f-5b08f3391017"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.772207 4848 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-utilities\") on node \"crc\" DevicePath \"\"" Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.772231 4848 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f633976-d8e1-4f15-8c2f-5b08f3391017-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 17 10:13:00 crc kubenswrapper[4848]: I0217 10:13:00.772242 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnprf\" (UniqueName: \"kubernetes.io/projected/6f633976-d8e1-4f15-8c2f-5b08f3391017-kube-api-access-bnprf\") on node \"crc\" DevicePath \"\"" Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.176593 4848 generic.go:334] "Generic (PLEG): container finished" podID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerID="145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c" exitCode=0 Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.176635 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vhvgf" event={"ID":"6f633976-d8e1-4f15-8c2f-5b08f3391017","Type":"ContainerDied","Data":"145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c"} Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.176686 4848 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-vhvgf" event={"ID":"6f633976-d8e1-4f15-8c2f-5b08f3391017","Type":"ContainerDied","Data":"148b3cccb8db6234b87618f19497bcdd3f2614da3b245e55f59fb8033fb8fd14"} Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.176689 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vhvgf" Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.176705 4848 scope.go:117] "RemoveContainer" containerID="145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c" Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.212694 4848 scope.go:117] "RemoveContainer" containerID="02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd" Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.218342 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vhvgf"] Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.228095 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vhvgf"] Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.399210 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f633976-d8e1-4f15-8c2f-5b08f3391017" path="/var/lib/kubelet/pods/6f633976-d8e1-4f15-8c2f-5b08f3391017/volumes" Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.602171 4848 scope.go:117] "RemoveContainer" containerID="cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd" Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.658931 4848 scope.go:117] "RemoveContainer" containerID="145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c" Feb 17 10:13:01 crc kubenswrapper[4848]: E0217 10:13:01.660223 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c\": container with ID 
starting with 145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c not found: ID does not exist" containerID="145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c"
Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.660264 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c"} err="failed to get container status \"145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c\": rpc error: code = NotFound desc = could not find container \"145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c\": container with ID starting with 145c7e4a41b3081b615404dbfc946aaa050f7d1bf8b8a1d87d24b55c1b3b595c not found: ID does not exist"
Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.660291 4848 scope.go:117] "RemoveContainer" containerID="02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd"
Feb 17 10:13:01 crc kubenswrapper[4848]: E0217 10:13:01.661438 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd\": container with ID starting with 02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd not found: ID does not exist" containerID="02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd"
Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.661517 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd"} err="failed to get container status \"02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd\": rpc error: code = NotFound desc = could not find container \"02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd\": container with ID starting with 02b7bc50244c3fad001b59862892679ce6db5af69664c1dcc4e8009c7acbe8dd not found: ID does not exist"
Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.661578 4848 scope.go:117] "RemoveContainer" containerID="cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd"
Feb 17 10:13:01 crc kubenswrapper[4848]: E0217 10:13:01.661878 4848 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd\": container with ID starting with cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd not found: ID does not exist" containerID="cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd"
Feb 17 10:13:01 crc kubenswrapper[4848]: I0217 10:13:01.661902 4848 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd"} err="failed to get container status \"cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd\": rpc error: code = NotFound desc = could not find container \"cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd\": container with ID starting with cef28360f763f5417d83b48595608099d859a7acd72e6227c5dcf3984c20d6cd not found: ID does not exist"
Feb 17 10:13:10 crc kubenswrapper[4848]: I0217 10:13:10.383849 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab"
Feb 17 10:13:10 crc kubenswrapper[4848]: E0217 10:13:10.384905 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:13:21 crc kubenswrapper[4848]: I0217 10:13:21.384140 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab"
Feb 17 10:13:21 crc kubenswrapper[4848]: E0217 10:13:21.385069 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:13:35 crc kubenswrapper[4848]: I0217 10:13:35.384091 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab"
Feb 17 10:13:35 crc kubenswrapper[4848]: E0217 10:13:35.385364 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:13:46 crc kubenswrapper[4848]: I0217 10:13:46.383968 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab"
Feb 17 10:13:46 crc kubenswrapper[4848]: E0217 10:13:46.385175 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:14:00 crc kubenswrapper[4848]: I0217 10:14:00.383928 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab"
Feb 17 10:14:00 crc kubenswrapper[4848]: E0217 10:14:00.384950 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:14:11 crc kubenswrapper[4848]: I0217 10:14:11.383702 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab"
Feb 17 10:14:11 crc kubenswrapper[4848]: E0217 10:14:11.384509 4848 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-stvnz_openshift-machine-config-operator(7c28fed4-873d-42f6-ae63-03d12a425d0a)\"" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" podUID="7c28fed4-873d-42f6-ae63-03d12a425d0a"
Feb 17 10:14:23 crc kubenswrapper[4848]: I0217 10:14:23.401557 4848 scope.go:117] "RemoveContainer" containerID="c932d792615dceebbb48a39c1de378d1d244ebf4dd39405745a8b9cd59cc80ab"
Feb 17 10:14:23 crc kubenswrapper[4848]: I0217 10:14:23.975381 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-stvnz" event={"ID":"7c28fed4-873d-42f6-ae63-03d12a425d0a","Type":"ContainerStarted","Data":"571abbc4113fd708601376da11350e683c1a7af73709e3966bff4d07f2eae446"}
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.185204 4848 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"]
Feb 17 10:15:00 crc kubenswrapper[4848]: E0217 10:15:00.186181 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerName="extract-content"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.186199 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerName="extract-content"
Feb 17 10:15:00 crc kubenswrapper[4848]: E0217 10:15:00.186225 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerName="registry-server"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.186232 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerName="registry-server"
Feb 17 10:15:00 crc kubenswrapper[4848]: E0217 10:15:00.186266 4848 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerName="extract-utilities"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.186275 4848 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerName="extract-utilities"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.186493 4848 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f633976-d8e1-4f15-8c2f-5b08f3391017" containerName="registry-server"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.191135 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.193857 4848 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.194476 4848 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.195081 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"]
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.327208 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdpvf\" (UniqueName: \"kubernetes.io/projected/dfc1a619-c2bd-4a24-8604-be760f48d5cf-kube-api-access-fdpvf\") pod \"collect-profiles-29522055-ppx6x\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.327519 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfc1a619-c2bd-4a24-8604-be760f48d5cf-secret-volume\") pod \"collect-profiles-29522055-ppx6x\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.327826 4848 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfc1a619-c2bd-4a24-8604-be760f48d5cf-config-volume\") pod \"collect-profiles-29522055-ppx6x\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.430216 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdpvf\" (UniqueName: \"kubernetes.io/projected/dfc1a619-c2bd-4a24-8604-be760f48d5cf-kube-api-access-fdpvf\") pod \"collect-profiles-29522055-ppx6x\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.430387 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfc1a619-c2bd-4a24-8604-be760f48d5cf-secret-volume\") pod \"collect-profiles-29522055-ppx6x\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.430511 4848 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfc1a619-c2bd-4a24-8604-be760f48d5cf-config-volume\") pod \"collect-profiles-29522055-ppx6x\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.432482 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfc1a619-c2bd-4a24-8604-be760f48d5cf-config-volume\") pod \"collect-profiles-29522055-ppx6x\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.444944 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfc1a619-c2bd-4a24-8604-be760f48d5cf-secret-volume\") pod \"collect-profiles-29522055-ppx6x\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.451444 4848 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdpvf\" (UniqueName: \"kubernetes.io/projected/dfc1a619-c2bd-4a24-8604-be760f48d5cf-kube-api-access-fdpvf\") pod \"collect-profiles-29522055-ppx6x\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.515498 4848 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:00 crc kubenswrapper[4848]: I0217 10:15:00.997271 4848 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"]
Feb 17 10:15:01 crc kubenswrapper[4848]: W0217 10:15:01.002549 4848 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfc1a619_c2bd_4a24_8604_be760f48d5cf.slice/crio-5b45f5d6ea0a5456ab454bafc278858dcc619f868d7c7aad8c71631ed04d31f1 WatchSource:0}: Error finding container 5b45f5d6ea0a5456ab454bafc278858dcc619f868d7c7aad8c71631ed04d31f1: Status 404 returned error can't find the container with id 5b45f5d6ea0a5456ab454bafc278858dcc619f868d7c7aad8c71631ed04d31f1
Feb 17 10:15:01 crc kubenswrapper[4848]: I0217 10:15:01.383429 4848 generic.go:334] "Generic (PLEG): container finished" podID="dfc1a619-c2bd-4a24-8604-be760f48d5cf" containerID="b26a3f31d99540f670f4dc0b3dd07a4e9aa69ca6c8172a0a521f5d2acba96c6a" exitCode=0
Feb 17 10:15:01 crc kubenswrapper[4848]: I0217 10:15:01.393088 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x" event={"ID":"dfc1a619-c2bd-4a24-8604-be760f48d5cf","Type":"ContainerDied","Data":"b26a3f31d99540f670f4dc0b3dd07a4e9aa69ca6c8172a0a521f5d2acba96c6a"}
Feb 17 10:15:01 crc kubenswrapper[4848]: I0217 10:15:01.393129 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x" event={"ID":"dfc1a619-c2bd-4a24-8604-be760f48d5cf","Type":"ContainerStarted","Data":"5b45f5d6ea0a5456ab454bafc278858dcc619f868d7c7aad8c71631ed04d31f1"}
Feb 17 10:15:02 crc kubenswrapper[4848]: I0217 10:15:02.741168 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:02 crc kubenswrapper[4848]: I0217 10:15:02.877977 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfc1a619-c2bd-4a24-8604-be760f48d5cf-secret-volume\") pod \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") "
Feb 17 10:15:02 crc kubenswrapper[4848]: I0217 10:15:02.878071 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdpvf\" (UniqueName: \"kubernetes.io/projected/dfc1a619-c2bd-4a24-8604-be760f48d5cf-kube-api-access-fdpvf\") pod \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") "
Feb 17 10:15:02 crc kubenswrapper[4848]: I0217 10:15:02.878166 4848 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfc1a619-c2bd-4a24-8604-be760f48d5cf-config-volume\") pod \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\" (UID: \"dfc1a619-c2bd-4a24-8604-be760f48d5cf\") "
Feb 17 10:15:02 crc kubenswrapper[4848]: I0217 10:15:02.879049 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dfc1a619-c2bd-4a24-8604-be760f48d5cf-config-volume" (OuterVolumeSpecName: "config-volume") pod "dfc1a619-c2bd-4a24-8604-be760f48d5cf" (UID: "dfc1a619-c2bd-4a24-8604-be760f48d5cf"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 17 10:15:02 crc kubenswrapper[4848]: I0217 10:15:02.885939 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfc1a619-c2bd-4a24-8604-be760f48d5cf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dfc1a619-c2bd-4a24-8604-be760f48d5cf" (UID: "dfc1a619-c2bd-4a24-8604-be760f48d5cf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 17 10:15:02 crc kubenswrapper[4848]: I0217 10:15:02.894956 4848 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfc1a619-c2bd-4a24-8604-be760f48d5cf-kube-api-access-fdpvf" (OuterVolumeSpecName: "kube-api-access-fdpvf") pod "dfc1a619-c2bd-4a24-8604-be760f48d5cf" (UID: "dfc1a619-c2bd-4a24-8604-be760f48d5cf"). InnerVolumeSpecName "kube-api-access-fdpvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 17 10:15:02 crc kubenswrapper[4848]: I0217 10:15:02.981300 4848 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdpvf\" (UniqueName: \"kubernetes.io/projected/dfc1a619-c2bd-4a24-8604-be760f48d5cf-kube-api-access-fdpvf\") on node \"crc\" DevicePath \"\""
Feb 17 10:15:02 crc kubenswrapper[4848]: I0217 10:15:02.981352 4848 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dfc1a619-c2bd-4a24-8604-be760f48d5cf-config-volume\") on node \"crc\" DevicePath \"\""
Feb 17 10:15:02 crc kubenswrapper[4848]: I0217 10:15:02.981371 4848 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dfc1a619-c2bd-4a24-8604-be760f48d5cf-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 17 10:15:03 crc kubenswrapper[4848]: I0217 10:15:03.409592 4848 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x" event={"ID":"dfc1a619-c2bd-4a24-8604-be760f48d5cf","Type":"ContainerDied","Data":"5b45f5d6ea0a5456ab454bafc278858dcc619f868d7c7aad8c71631ed04d31f1"}
Feb 17 10:15:03 crc kubenswrapper[4848]: I0217 10:15:03.409921 4848 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b45f5d6ea0a5456ab454bafc278858dcc619f868d7c7aad8c71631ed04d31f1"
Feb 17 10:15:03 crc kubenswrapper[4848]: I0217 10:15:03.410896 4848 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29522055-ppx6x"
Feb 17 10:15:03 crc kubenswrapper[4848]: I0217 10:15:03.819056 4848 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z"]
Feb 17 10:15:03 crc kubenswrapper[4848]: I0217 10:15:03.830088 4848 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29522010-fmh2z"]
Feb 17 10:15:05 crc kubenswrapper[4848]: I0217 10:15:05.397474 4848 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e13e335f-75a3-43a6-9bd8-7ea62595fa13" path="/var/lib/kubelet/pods/e13e335f-75a3-43a6-9bd8-7ea62595fa13/volumes"